What is the salary of Argentinians?

Introduction

Business context. We will analyze data from the Encuesta Permanente de Hogares (Permanent Household Survey), conducted by INDEC (Instituto Nacional de Estadística y Censos). The databases contain a large number of household- and person-level variables, enabling analysis of the main demographic and socioeconomic characteristics of the Argentine population. The survey has been conducted quarterly since 2003.

Business problem. We will focus our analysis on people's salaries. We will only use the most recent dataset available at the time this work began (first quarter of 2022), for two reasons. First, to keep the data volume manageable (this single database alone has 177 columns and over 49 thousand rows). Second, because our goal is to predict salary from a set of variables described below, without accounting for monthly/annual inflation.

Analytical context. The Excel file was downloaded from the official INDEC site (https://www.indec.gob.ar/indec/web/Institucional-Indec-BasesDeDatos-1). In it, each record has an identification number (CODUSU) that links a dwelling to the households and people that compose it.

All members of a household share the same CODUSU and NRO_HOGAR, and are distinguished by their COMPONENTE number.

The documentation detailing the meaning of each dataset column can also be found at the link above.
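The key structure described above can be sketched with a toy example (hypothetical CODUSU values, not real survey data): grouping on the household key (CODUSU, NRO_HOGAR) recovers the members of each household.

```python
import pandas as pd

# Hypothetical mini-sample: people in the same dwelling/household share
# CODUSU and NRO_HOGAR and are distinguished only by COMPONENTE.
personas = pd.DataFrame({
    'CODUSU': ['ABC123', 'ABC123', 'XYZ789'],
    'NRO_HOGAR': [1, 1, 1],
    'COMPONENTE': [1, 2, 1],
})

# Number of members per household, grouping on the household key
miembros = personas.groupby(['CODUSU', 'NRO_HOGAR'])['COMPONENTE'].count()
print(miembros.loc[('ABC123', 1)])  # household ('ABC123', 1) has 2 members
```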

The following tasks will be carried out:

  1. Read, transform, and prepare the data for visualization.
  2. Perform analyses and build visualizations to identify patterns in the dataset.
  3. Transform and prepare the data for the Machine Learning models, aiming to predict workers' salaries with the lowest possible error.
  4. Select the winning Machine Learning model and tune its hyperparameters to improve the prediction.

Library imports

In [6]:
import plotly
plotly.offline.init_notebook_mode(connected=True) #To be able to display plotly charts in HTML format
In [7]:
#Numerical computing library
import numpy as np

#Data manipulation and analysis library
import pandas as pd

#Visualization libraries
import matplotlib as mpl
import matplotlib.pyplot as plt

import seaborn as sns

import plotly.express as px
from plotly import subplots
from plotly.subplots import make_subplots
import plotly.graph_objects as go

#Iterator-building library
import itertools

#Suppress warning messages
import warnings
warnings.filterwarnings('ignore')

Installation of Machine Learning algorithm packages

Installing the CatBoost package

In [8]:
pip install catboost
Looking in indexes: https://pypi.org/simple, https://us-python.pkg.dev/colab-wheels/public/simple/
Collecting catboost
  Downloading catboost-1.1.1-cp39-none-manylinux1_x86_64.whl (76.6 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 76.6/76.6 MB 14.5 MB/s eta 0:00:00
Requirement already satisfied: pandas>=0.24.0 in /usr/local/lib/python3.9/dist-packages (from catboost) (1.4.4)
Requirement already satisfied: plotly in /usr/local/lib/python3.9/dist-packages (from catboost) (5.5.0)
Requirement already satisfied: scipy in /usr/local/lib/python3.9/dist-packages (from catboost) (1.10.1)
Requirement already satisfied: graphviz in /usr/local/lib/python3.9/dist-packages (from catboost) (0.10.1)
Requirement already satisfied: numpy>=1.16.0 in /usr/local/lib/python3.9/dist-packages (from catboost) (1.22.4)
Requirement already satisfied: six in /usr/local/lib/python3.9/dist-packages (from catboost) (1.15.0)
Requirement already satisfied: matplotlib in /usr/local/lib/python3.9/dist-packages (from catboost) (3.5.3)
Requirement already satisfied: python-dateutil>=2.8.1 in /usr/local/lib/python3.9/dist-packages (from pandas>=0.24.0->catboost) (2.8.2)
Requirement already satisfied: pytz>=2020.1 in /usr/local/lib/python3.9/dist-packages (from pandas>=0.24.0->catboost) (2022.7.1)
Requirement already satisfied: cycler>=0.10 in /usr/local/lib/python3.9/dist-packages (from matplotlib->catboost) (0.11.0)
Requirement already satisfied: pyparsing>=2.2.1 in /usr/local/lib/python3.9/dist-packages (from matplotlib->catboost) (3.0.9)
Requirement already satisfied: kiwisolver>=1.0.1 in /usr/local/lib/python3.9/dist-packages (from matplotlib->catboost) (1.4.4)
Requirement already satisfied: packaging>=20.0 in /usr/local/lib/python3.9/dist-packages (from matplotlib->catboost) (23.0)
Requirement already satisfied: pillow>=6.2.0 in /usr/local/lib/python3.9/dist-packages (from matplotlib->catboost) (8.4.0)
Requirement already satisfied: fonttools>=4.22.0 in /usr/local/lib/python3.9/dist-packages (from matplotlib->catboost) (4.39.0)
Requirement already satisfied: tenacity>=6.2.0 in /usr/local/lib/python3.9/dist-packages (from plotly->catboost) (8.2.2)
Installing collected packages: catboost
Successfully installed catboost-1.1.1

Installing the LightGBM package

In [9]:
pip install lightgbm
Looking in indexes: https://pypi.org/simple, https://us-python.pkg.dev/colab-wheels/public/simple/
Requirement already satisfied: lightgbm in /usr/local/lib/python3.9/dist-packages (2.2.3)
Requirement already satisfied: numpy in /usr/local/lib/python3.9/dist-packages (from lightgbm) (1.22.4)
Requirement already satisfied: scikit-learn in /usr/local/lib/python3.9/dist-packages (from lightgbm) (1.2.2)
Requirement already satisfied: scipy in /usr/local/lib/python3.9/dist-packages (from lightgbm) (1.10.1)
Requirement already satisfied: joblib>=1.1.1 in /usr/local/lib/python3.9/dist-packages (from scikit-learn->lightgbm) (1.1.1)
Requirement already satisfied: threadpoolctl>=2.0.0 in /usr/local/lib/python3.9/dist-packages (from scikit-learn->lightgbm) (3.1.0)

Installing the XGBoost package

In [10]:
pip install xgboost
Looking in indexes: https://pypi.org/simple, https://us-python.pkg.dev/colab-wheels/public/simple/
Requirement already satisfied: xgboost in /usr/local/lib/python3.9/dist-packages (1.7.4)
Requirement already satisfied: scipy in /usr/local/lib/python3.9/dist-packages (from xgboost) (1.10.1)
Requirement already satisfied: numpy in /usr/local/lib/python3.9/dist-packages (from xgboost) (1.22.4)

Importing algorithms and evaluation metrics

In [11]:
#For outlier removal
from sklearn.ensemble import IsolationForest

#Algorithm imports
from sklearn.model_selection import train_test_split
from sklearn.model_selection import cross_validate
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.ensemble import HistGradientBoostingRegressor
from catboost import CatBoostRegressor
import lightgbm as lgb
from xgboost import XGBRegressor

#Metric imports
from sklearn.metrics import mean_squared_error
from sklearn.metrics import r2_score
from sklearn.metrics import mean_absolute_error

Dataset overview

Importing the dataset

In [12]:
#Read the Excel file
df= pd.read_excel('usu_individual_T122.xlsx')
df.head(5)
Out[12]:
CODUSU ANO4 TRIMESTRE NRO_HOGAR COMPONENTE H15 REGION MAS_500 AGLOMERADO PONDERA ... PDECIFR ADECIFR IPCF DECCFR IDECCFR RDECCFR GDECCFR PDECCFR ADECCFR PONDIH
0 TQRMNOQXQHLOKQCDEGKDB00777573 2022 1 1 2 1 43 N 14 104 ... 10.0 10 77500.0 9 9.0 9 NaN 9.0 10 194
1 TQRMNOQXQHLOKQCDEGKDB00777573 2022 1 1 3 1 43 N 14 104 ... 10.0 10 77500.0 9 9.0 9 NaN 9.0 10 194
2 TQRMNOQXQHLOKQCDEGKDB00777573 2022 1 1 4 1 43 N 14 104 ... 10.0 10 77500.0 9 9.0 9 NaN 9.0 10 194
3 TQRMNOSUPHKKPQCDEIJAH00780151 2022 1 1 1 1 1 S 33 1741 ... NaN 12 0.0 12 NaN 12 12.0 NaN 12 0
4 TQRMNOSUPHKKPQCDEIJAH00780151 2022 1 1 2 1 1 S 33 1741 ... NaN 12 0.0 12 NaN 12 12.0 NaN 12 0

5 rows × 177 columns

Dataset size

In [13]:
#Dataset size
df.shape
Out[13]:
(49706, 177)

We see that our dataset consists of 49706 rows and 177 columns.

Information about the dataset columns

In [14]:
df.info()
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 49706 entries, 0 to 49705
Columns: 177 entries, CODUSU to PONDIH
dtypes: float64(106), int64(66), object(5)
memory usage: 67.1+ MB

Of the 177 columns, 106 are floats, 66 are integers, and 5 are object (string) columns.

We will apply pandas' .describe() method to summarize the numeric columns.

In [15]:
describe= df.describe()
describe2= describe.transpose()
describe2
Out[15]:
count mean std min 25% 50% 75% max
ANO4 49706.0 2022.000000 0.000000 2022.0 2022.00 2022.0 2022.0 2022.0
TRIMESTRE 49706.0 1.000000 0.000000 1.0 1.00 1.0 1.0 1.0
NRO_HOGAR 49706.0 1.041544 0.737641 1.0 1.00 1.0 1.0 52.0
COMPONENTE 49706.0 2.494387 1.620591 1.0 1.00 2.0 3.0 51.0
H15 49706.0 0.871907 0.338266 0.0 1.00 1.0 1.0 2.0
... ... ... ... ... ... ... ... ...
RDECCFR 49706.0 6.300507 3.883657 0.0 3.00 6.0 10.0 12.0
GDECCFR 22971.0 6.555309 4.101544 0.0 3.00 6.0 12.0 12.0
PDECCFR 26735.0 5.980550 3.711121 0.0 3.00 5.0 9.0 12.0
ADECCFR 49706.0 6.324568 3.880445 0.0 3.00 6.0 10.0 12.0
PONDIH 49706.0 584.899147 1318.860078 0.0 90.25 221.0 464.0 26184.0

172 rows × 8 columns

This description of the numeric columns is only meant to give a first idea of the dataset's composition. Keep in mind, however, that the 49706 rows include both people who work and people who do not, so a filter will later be applied to keep only the economically active.
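The later filter mentioned above can be sketched as follows (a minimal example on a hypothetical mini-sample, assuming active workers are identified by a positive value in the income column P21):

```python
import pandas as pd

# Hypothetical sample of the income column P21:
# -9 = non-response, 0 = no labour income, > 0 = worker income
df_demo = pd.DataFrame({'P21': [-9, 0, 45000, 70000, 0]})

# Keep only the rows of income-earning (active) people
trabajadores = df_demo[df_demo['P21'] > 0]
print(len(trabajadores))  # 2 active workers remain
```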

Most relevant variables for the analysis

After carefully reading the documentation, we selected the variables we consider most important for predicting Argentine workers' incomes.

Among the selected variables, some are very similar to one another, for example those describing educational level (NIVEL_ED and CH12), or those describing occupational status (PP04D_COD contains information similar to CAT_OCUP).

We will run the EDA and the Machine Learning modeling with all of them, and decide later which to keep in the final model.

EDA

Hypothesis formulation

We will build visualizations to try to test the following hypotheses:

  1. Incomes of independent workers show greater variability than those of salaried workers.

  2. For salaried workers, there is a gap between the salaries earned by men and women.

  3. Salaried workers' wages vary by region of the country: those living in the capital and its surroundings earn more than those living in the interior.

  4. Workers, whether salaried or independent, who attained a higher educational level earn higher incomes.

  5. Workers, whether salaried or independent, who are older earn higher incomes.

  6. Without distinguishing between salaried and independent workers, the more hours worked, the higher the income earned.

  7. The intensity of a person's occupation influences their income.

  8. People's incomes depend largely on their occupational qualification and on the technology they use at work.

  9. The characteristics of the company/business people work for influence their income (public vs. private sector, company size, physical workplace, economic activity).
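Hypothesis 1, for instance, can later be checked with a simple groupby. The sketch below uses a tiny hypothetical sample (the category labels are assumptions, not the EPH codes) to show the idea of comparing income dispersion across occupational categories.

```python
import pandas as pd

# Hypothetical mini-sample: income spread by worker type
df_demo = pd.DataFrame({
    'CAT_OCUP': ['Asalariado', 'Asalariado', 'Independiente', 'Independiente'],
    'P21': [40000, 50000, 10000, 150000],
})

# Standard deviation of income per occupational category
dispersion = df_demo.groupby('CAT_OCUP')['P21'].std()
print(dispersion)
```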

Preliminary analysis of the workers' income field

We want to examine the column containing workers' incomes:

In [16]:
df['P21'].describe().round(0)
Out[16]:
count      49706.0
mean       19199.0
std        39613.0
min           -9.0
25%            0.0
50%            0.0
75%        30000.0
max      2000000.0
Name: P21, dtype: float64

As we can see, the minimum value is -9, which, as noted above, is a non-response code. We will therefore drop the rows containing this value.

In [17]:
ingresos_sin_especificar=df.P21[df['P21']==-9]
In [18]:
ingresos_sin_especificar.value_counts()
Out[18]:
-9    3465
Name: P21, dtype: int64

There are 3465 rows with unspecified income. We will drop them next.

In [19]:
df= df[df['P21']!=-9]
In [20]:
df.shape
Out[20]:
(46241, 177)

This leaves us with a dataset reduced to 46241 rows.

Applying the describe method again to the income column, this time considering only values greater than zero:

In [21]:
df['P21'][df['P21']>0].describe().round(0)
Out[21]:
count      17212.0
mean       55445.0
std        50220.0
min          200.0
25%        25000.0
50%        45000.0
75%        70000.0
max      2000000.0
Name: P21, dtype: float64
In [22]:
ingresos_de_trabajadores_en_dólares = (55445/116.51) #Official exchange rate as of March 2022
ingresos_de_trabajadores_en_dólares = round(ingresos_de_trabajadores_en_dólares, 0)
print('Average income of Argentine workers in US dollars: U$D '+str(ingresos_de_trabajadores_en_dólares))
Average income of Argentine workers in US dollars: U$D 476.0

We note that workers' average income is $55445 Argentine pesos, about 476 US dollars at the official exchange rate of March 2022, the month the dataset's salaries correspond to.

Correlation matrix 1

We will create a new dataset containing only the variables we consider relevant, in order to analyze their correlation with the target variable: income (column 'P21').

In [23]:
#New dataset with only the relevant variables
df_1=df[['REGION','AGLOMERADO','MAS_500','CH04','CH06','CH12','NIVEL_ED','ESTADO','CAT_OCUP','PP03D','PP3E_TOT','INTENSI','PP04A','PP04B_COD','PP04C','PP04D_COD','PP04G','P21']]
In [24]:
#We will analyze the correlation between the numeric variables
correlations= df_1.corr()
In [25]:
indx=correlations.index

#Visualize the correlation matrix
plt.figure(figsize=(16,12))
sns.heatmap(df_1[indx].corr(),annot=True,cmap="YlGnBu")
Out[25]:
<AxesSubplot:>

A priori, with no transformation of the dataframe columns, the variable most strongly related to income "P21" would be the occupational category "CAT_OCUP", with a correlation coefficient of 0.6.

We will see whether this first impression holds, and what insights emerge as the EDA progresses.

Classification of the variables of interest

Next we will group the variables into numeric and categorical (the latter further split into ordinal and non-ordinal), since each group will later receive a different treatment.

Numeric variables:

  • Workers' age (not binned into ranges): 'CH06'
  • Weekly hours worked: 'PP3E_TOT'
  • Income of all workers: 'P21' -- Target variable

Non-ordinal categorical variables:

  • Sex: 'CH04'
  • Region of residence: 'REGION'
  • Urban agglomeration of residence: 'AGLOMERADO'
  • Agglomeration size: 'MAS_500'
  • Occupation code (professions): 'PP04D_COD'
  • Company type (state/private/other): 'PP04A'
  • Company's economic activity: 'PP04B_COD'
  • Physical workplace: 'PP04G'
  • Age (binned into ranges): 'Edad_2' -- Variable to be created

Ordinal categorical variables:

  • Educational level 1: 'NIVEL_ED'
  • Educational level 2: 'CH12'
  • Activity status: 'ESTADO'
  • Occupational category: 'CAT_OCUP'
  • Occupation intensity: 'INTENSI'
  • Company size: 'PP04C'
  • Number of occupations: 'PP03D' (although numeric, it is best analyzed as categorical).
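The grouping above can be kept in plain Python lists so that each class of variable later receives its own treatment (e.g. scaling for numeric, one-hot encoding for non-ordinal, ordinal encoding for ordinal); a minimal sketch, using the original column names ('Edad_2' is excluded because it does not exist yet):

```python
# Column groups, as classified above
numericas = ['CH06', 'PP3E_TOT', 'P21']
categoricas_no_ordinales = ['CH04', 'REGION', 'AGLOMERADO', 'MAS_500',
                            'PP04D_COD', 'PP04A', 'PP04B_COD', 'PP04G']
categoricas_ordinales = ['NIVEL_ED', 'CH12', 'ESTADO', 'CAT_OCUP',
                         'INTENSI', 'PP04C', 'PP03D']

todas = numericas + categoricas_no_ordinales + categoricas_ordinales
assert len(todas) == len(set(todas))  # no column is classified twice
```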

Data wrangling

As part of cleaning and preparing the data for visualization, the first step is handling the null values.

In [26]:
df_1.info()
<class 'pandas.core.frame.DataFrame'>
Int64Index: 46241 entries, 0 to 49705
Data columns (total 18 columns):
 #   Column      Non-Null Count  Dtype  
---  ------      --------------  -----  
 0   REGION      46241 non-null  int64  
 1   AGLOMERADO  46241 non-null  int64  
 2   MAS_500     46241 non-null  object 
 3   CH04        46241 non-null  int64  
 4   CH06        46241 non-null  int64  
 5   CH12        46241 non-null  int64  
 6   NIVEL_ED    46241 non-null  int64  
 7   ESTADO      46241 non-null  int64  
 8   CAT_OCUP    46241 non-null  int64  
 9   PP03D       17756 non-null  float64
 10  PP3E_TOT    17756 non-null  float64
 11  INTENSI     17756 non-null  float64
 12  PP04A       17756 non-null  float64
 13  PP04B_COD   17756 non-null  float64
 14  PP04C       17756 non-null  float64
 15  PP04D_COD   17756 non-null  float64
 16  PP04G       17756 non-null  float64
 17  P21         46241 non-null  int64  
dtypes: float64(8), int64(9), object(1)
memory usage: 6.7+ MB

Since the columns at index positions 9 through 16 contain null values, we will replace them with zeros.

In [27]:
df_1.fillna(0,inplace=True)
df_1.info()
<class 'pandas.core.frame.DataFrame'>
Int64Index: 46241 entries, 0 to 49705
Data columns (total 18 columns):
 #   Column      Non-Null Count  Dtype  
---  ------      --------------  -----  
 0   REGION      46241 non-null  int64  
 1   AGLOMERADO  46241 non-null  int64  
 2   MAS_500     46241 non-null  object 
 3   CH04        46241 non-null  int64  
 4   CH06        46241 non-null  int64  
 5   CH12        46241 non-null  int64  
 6   NIVEL_ED    46241 non-null  int64  
 7   ESTADO      46241 non-null  int64  
 8   CAT_OCUP    46241 non-null  int64  
 9   PP03D       46241 non-null  float64
 10  PP3E_TOT    46241 non-null  float64
 11  INTENSI     46241 non-null  float64
 12  PP04A       46241 non-null  float64
 13  PP04B_COD   46241 non-null  float64
 14  PP04C       46241 non-null  float64
 15  PP04D_COD   46241 non-null  float64
 16  PP04G       46241 non-null  float64
 17  P21         46241 non-null  int64  
dtypes: float64(8), int64(9), object(1)
memory usage: 6.7+ MB

Numeric variables

We will rename the numeric-variable columns to make the charts easier to interpret.

In [28]:
df_1 = df_1.rename(columns={'CH06':'Edad','PP3E_TOT': 'Horas_sem', 'P21':'Ingresos', 'PP03D':'Cant_Ocup'})

Categorical variables

We will also rename some of the categorical-variable columns to make the visualizations easier to interpret.

In [29]:
df_1 = df_1.rename(columns={'CH04':'Sexo','PP04D_COD':'Cod_Ocup','PP04A':'Tipo_empr','PP04B_COD': 'Cod_activ','PP04G':'Lugar_trab','CH12':'NIVEL_ED_2','PP04C':'Tamaño_empr'})

Next, we will translate the category codes into their text meanings.

We will leave the variables 'Cod_Ocup' and 'Cod_activ' for last, since they require additional transformations.

1) Sex

In [30]:
#Map the Sexo column codes to their meanings
df_1['Sexo']=df_1['Sexo'].replace({2:'F',1:'M'})

2) Region

In [31]:
#Map the REGION column codes to their meanings
df_1['REGION']=df_1['REGION'].replace({1:'GBA', 40:'NOA', 41:'NEA',
                                          42:'CUYO', 43:'PAMPEANA', 44:'PATAGONIA'})

3) Agglomeration

In [32]:
#Verify that value 99 (the Ns/Nr category) is absent.
df_1['AGLOMERADO'].value_counts()
Out[32]:
33    4340
23    2301
13    2150
29    2062
10    1999
27    1861
19    1781
4     1710
18    1677
22    1476
25    1428
32    1407
38    1404
14    1350
15    1347
26    1296
8     1292
91    1242
5     1215
36    1191
7     1140
6     1123
17    1096
12    1073
93    1036
31    1035
9      983
2      962
3      860
34     823
20     819
30     762
Name: AGLOMERADO, dtype: int64

Since there are no non-response codes, for now we will leave the AGLOMERADO column values as they are. The only transformation will be converting the variable to integer type.

In [33]:
#Convert the 'AGLOMERADO' column to integer.
df_1['AGLOMERADO']=df_1['AGLOMERADO'].astype(int)

4) Agglomeration size

No transformation of this variable is required for now.

5) Company type

In [34]:
#Map the Tipo_empr column codes to their meanings
df_1['Tipo_empr']=df_1['Tipo_empr'].replace({1:'Estatal', 2:'Privada', 3:'Otro'})

6) Physical workplace

In [35]:
#Verify that value 99 (the Ns/Nr category) is absent.
df_1['Lugar_trab'].value_counts()
Out[35]:
0.0     29721
1.0     10830
8.0      1823
6.0      1236
4.0       809
9.0       756
5.0       628
10.0      217
3.0       115
7.0        68
2.0        38
Name: Lugar_trab, dtype: int64

Since there are no non-response codes, for now we will leave the 'Lugar_trab' column values as they are. The only transformation will be converting the variable from float to integer.

In [36]:
#Convert the 'Lugar_trab' column to integer.
df_1['Lugar_trab']=df_1['Lugar_trab'].astype(int)

7) Educational level 1

In [37]:
#Map the educational level codes to their meanings
df_1['NIVEL_ED']=df_1['NIVEL_ED'].replace({9:'0-Ns/Nr',1:'1-Primario incompl',2:'2-Primario compl',
                                           3:'3-Secundario incompl',4:'4-Secundario compl',5:'5-Superior universit incompl',
                                           6:'6-Superior universit compl',7:'0-Sin instrucción'})

8) Educational level 2

In [38]:
#Map the educational level codes to their meanings
df_1['NIVEL_ED_2']=df_1['NIVEL_ED_2'].replace({9:'0-Educación especial', 0:'1-Jardín/preescolar/Sin instr.', 1:'1-Jardín/preescolar/Sin instr.',2:'2-Primario',
                                           3:'3-EGB',4:'4-Secundario',5:'5-Polimodal',
                                           6:'6-Terciario',7:'7-Universitario',8:'8-Posgrado universitario'})
In [39]:
df_1['NIVEL_ED_2'].value_counts()
Out[39]:
4-Secundario                      18022
2-Primario                        11659
7-Universitario                    6560
6-Terciario                        4383
1-Jardín/preescolar/Sin instr.     3629
3-EGB                               855
5-Polimodal                         586
0-Educación especial                274
8-Posgrado universitario            269
99                                    4
Name: NIVEL_ED_2, dtype: int64

9) Activity status

In [40]:
#Replace values 2, 3 and 4 with zero, since these people receive no income
df_1['ESTADO']=df_1['ESTADO'].replace({2:0, 3:0, 4:0})

10) Occupational category

In [41]:
#Map the occupational category codes to their meanings
df_1['CAT_OCUP']=df_1['CAT_OCUP'].replace({9:'Ns/Nr',1:'Patron',2:'Cuenta Propia',3:'Obrero o empleado',4:'Trabajador fliar s/remun'})

11) Occupation intensity

In [42]:
#Map the occupation intensity codes to their meanings
df_1['INTENSI']=df_1['INTENSI'].replace({9:'Ns/Nr',1:'1-Subocupado',2:'2-Ocupado pleno',3:'3-Sobreocupado',4:'2-Ocup q no trabajó ult sem'})

12) Company size 1

In [43]:
#Map the Tamaño_empr codes to their meanings
df_1['Tamaño_empr']=df_1['Tamaño_empr'].replace({1:'1) 1 pers',2:'2) 2 pers',3:'3) 3 pers',4:'4) 4 pers',5:'5) 5 pers',
                                         6:'6) 6 a 10 pers', 7:'7) 11 a 25 pers', 8:'8) 26 a 40 pers', 9:'9) 41 a 100 pers',
                                         10:'10) 101 a 200 pers', 11:'11) 201 a 500 pers', 12:'12) Más de 500 pers', 0:'0) Ns/Nr', 99:'0) Ns/Nr'})

12.1) Company size 2

To reduce the number of categories in the Tamaño_empr variable, we will create a new variable, Tamaño_empr_2, classifying companies by the following criterion:

1) Micro: up to FIVE (5) employees;

2) Small: SIX (6) to FORTY (40) employees;

3) Medium: FORTY-ONE (41) to TWO HUNDRED (200) employees;

4) Large: more than TWO HUNDRED (200) employees.

In [44]:
#Create the Tamaño_empr_2 column from the Tamaño_empr column
df_1['Tamaño_empr_2']=df_1['Tamaño_empr'].replace({'1) 1 pers': 'Microempresa','2) 2 pers':'Microempresa','3) 3 pers':'Microempresa','4) 4 pers':'Microempresa',
                                                  '5) 5 pers':'Microempresa', '6) 6 a 10 pers':'Pequeña',
                                         '7) 11 a 25 pers':'Pequeña', '8) 26 a 40 pers':'Pequeña', '9) 41 a 100 pers':'Mediana',
                                         '10) 101 a 200 pers':'Mediana','11) 201 a 500 pers':'Grande', '12) Más de 500 pers':'Grande'})

13) Occupation code (professions and hierarchies)

This variable, originally named 'PP04D_COD' and renamed 'Cod_Ocup', is a 5-digit code in which each digit encodes a category of its own, as detailed in the survey codebook.

We will approach the analysis in two ways:

On one hand, we will keep the column as-is, bearing in mind that although its values are numbers, it is a categorical variable.

On the other hand, we will create four new variables from the original one.
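As a quick illustration of the digit split (using one hypothetical code, not survey data), string slicing recovers the four sub-categories:

```python
# Hypothetical 5-digit occupation code: two digits of character,
# then one digit each of hierarchy, technology and qualification
codigo = '48311'

caracter = codigo[-5:-3]       # first two digits
jerarquia = codigo[-3:-2]      # third digit
tecnologia = codigo[-2:-1]     # fourth digit
calificacion = codigo[-1:]     # fifth digit

print(caracter, jerarquia, tecnologia, calificacion)  # 48 3 1 1
```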

In [45]:
df_1['Cod_Ocup'].value_counts()
Out[45]:
0.0        28485
10333.0     1263
55314.0      900
56314.0      744
30113.0      716
           ...  
34203.0        1
40319.0        1
20311.0        1
46112.0        1
62312.0        1
Name: Cod_Ocup, Length: 435, dtype: int64

We see that 28485 people did not complete this field.

13.0) Creating new variables from 'Cod_Ocup'
In [46]:
#Convert the 'Cod_Ocup' column to integer.
df_1['Cod_Ocup']=df_1['Cod_Ocup'].astype(int)

#Create a copy of 'Cod_Ocup' as a string column
df_1['ocupacion']=df_1['Cod_Ocup'].astype(str)

ocupacion= df_1['ocupacion']
In [47]:
#Create the Occupational Character column
#Extract the first two characters into a new dataset column
df_1['CARACTER_OCUP']=ocupacion.str[-5:-3]
In [48]:
#Create the Occupational Hierarchy column
#Extract the third character into a new dataset column
df_1['JERARQUIA_OCUP']=ocupacion.str[-3:-2]
In [49]:
#Create the Occupational Technology column
#Extract the fourth character into a new dataset column
df_1['TECNOLOGIA_OCUP']=ocupacion.str[-2:-1]
In [50]:
#Create the Occupational Qualification column
#Extract the fifth character into a new dataset column
df_1['CALIFICACION_OCUP']=ocupacion.str[-1:]
13.1) Occupational character

In [51]:
df_1['CARACTER_OCUP'].nunique()
Out[51]:
52

Our dataset contains 52 distinct categories for the occupational character variable. We will analyze them later.

13.2) Occupational hierarchy

In [52]:
#Map the occupational hierarchy codes to their meanings
df_1['JERARQUIA_OCUP']=df_1['JERARQUIA_OCUP'].replace({'0':'Direccion','1':'Cuenta propia','2':'Jefes','3':'Asalariados','9':'Ns/Nr'})
In [53]:
#Verify that the values were replaced correctly
df_1['JERARQUIA_OCUP'].value_counts()
Out[53]:
                 28487
Asalariados      12831
Cuenta propia     3723
Direccion          736
Jefes              383
Ns/Nr               81
Name: JERARQUIA_OCUP, dtype: int64
13.3) Occupational technology

In [54]:
df_1['TECNOLOGIA_OCUP'].nunique()
Out[54]:
6

We get 6 categories instead of 3. Let's see why:

In [55]:
df_1['TECNOLOGIA_OCUP'].value_counts()
Out[55]:
     28487
1    10400
3     4612
2     1542
0     1119
9       81
Name: TECNOLOGIA_OCUP, dtype: int64
In [56]:
#Map the occupational technology codes to their meanings
df_1['TECNOLOGIA_OCUP']=df_1['TECNOLOGIA_OCUP'].replace({'0':'Ns/Nr','9':'Ns/Nr','1':'Sin op. máq.','2':'Con op. máq.','3':'Op. sist. informatizados'})
In [57]:
#Replace all empty cells in the new columns with zeros
df_1['CARACTER_OCUP']=df_1['CARACTER_OCUP'].replace({'':0})
df_1['JERARQUIA_OCUP']=df_1['JERARQUIA_OCUP'].replace({'':0})
df_1['TECNOLOGIA_OCUP']=df_1['TECNOLOGIA_OCUP'].replace({'':0})
13.4) Occupational qualification

In [58]:
df_1['CALIFICACION_OCUP'].value_counts()
Out[58]:
0    28485
3     9299
4     3807
2     3175
1     1390
7       71
9        8
8        6
Name: CALIFICACION_OCUP, dtype: int64
In [59]:
#Map the occupational qualification codes to their meanings
df_1['CALIFICACION_OCUP']=df_1['CALIFICACION_OCUP'].replace({'1':'Profesionales','2':'Técnicos','3':'Operativo','4':'No calificado', '7':'Ns/Nr','8':'Ns/Nr','9':'Ns/Nr'})
In [60]:
#Drop the auxiliary "ocupacion" column created earlier
df_1=df_1.drop(['ocupacion'], axis=1)

14) Economic activity code

This variable, originally named 'PP04B_COD' and renamed 'Cod_activ', is a 4-digit code in which the first two digits indicate a category and the last two a subcategory.

For the analysis, we will extract the first two digits into a new variable named 'ACTIV_ECON'. To avoid excessive detail, the remaining two digits will be ignored.

We will then group the 'ACTIV_ECON' categories into broader ones, storing this information in a new variable named 'CAT_ECON' (identified by an alphabetic code).

The survey codebook details the economic activities corresponding to each code.
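The extract-then-group idea can be sketched on a single hypothetical code; the three mappings shown ('48'→'G', '84'→'O', '85'→'P') are taken from the notebook's own mapping dictionary, with the rest omitted:

```python
# Assumed subset of the activity-to-category mapping used later in the notebook
mapa = {'48': 'G', '84': 'O', '85': 'P'}

cod_activ = '4811'            # hypothetical 4-digit activity code
activ_econ = cod_activ[:2]    # first two digits -> activity group
cat_econ = mapa.get(activ_econ, 'W')  # 'W' = unspecified, as in the notebook

print(activ_econ, cat_econ)  # 48 G
```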

14.0) Creating new variables from 'Cod_activ'
In [61]:
#We want to extract the first two digits of Cod_activ, so we first convert it to integer
df_1['Cod_activ']= df_1['Cod_activ'].astype(int)
In [62]:
#Replace code 9999 (empty or poorly specified activity) with zero
df_1['Cod_activ']=df_1['Cod_activ'].replace({9999:0})
In [63]:
#Convert Cod_activ to string so we can extract the two leftmost digits
actividad_economica= df_1['Cod_activ'].astype(str)
14.1) Creating the Economic Activity variable
In [64]:
#Create the 'ACTIV_ECON' variable
#Extract the first two characters into a new dataset column
df_1['ACTIV_ECON']=actividad_economica.str[:2]
In [65]:
df_1['ACTIV_ECON'].value_counts()
Out[65]:
0     28639
48     2799
84     2298
40     1792
85     1541
      ...  
50        5
39        1
99        1
12        1
37        1
Name: ACTIV_ECON, Length: 78, dtype: int64
14.2) Creating the Economic Category variable
In [66]:
#Assign an alphabetic code to each group of economic activities, following the survey codebook
df_1['CAT_ECON']=df_1['ACTIV_ECON'].replace({
                                            '1':'A','01':'A','02':'A','03':'A',
                                            '05':'B','06':'B','07':'B','08':'B','09':'B',
                                            '10':'C','11':'C','12':'C','13':'C',
                                            '14':'C','15':'C','16':'C','17':'C','18':'C',
                                            '19':'C','20':'C','21':'C','22':'C','23':'C',
                                            '24':'C','25':'C','26':'C','27':'C','28':'C',
                                            '29':'C','30':'C','31':'C','32':'C','33':'C',
                                            '35':'D',
                                            '36':'E','37':'E','38':'E','39':'E',
                                            '40':'F',
                                            '45':'G','48':'G', 
                                            '49':'H','50':'H','51':'H','52':'H','53':'H',
                                            '55':'I','56':'I', 
                                            '58':'J','59':'J','60':'J','61':'J','62':'J','63':'J',
                                            '64':'K','65':'K','66':'K',
                                            '68':'L',
                                            '69':'M','70':'M','71':'M','72':'M','73':'M','74':'M','75':'M',
                                            '77':'N','78':'N','79':'N','80':'N','81':'N','82':'N',
                                            '83':'O','84':'O', 
                                            '85':'P',
                                            '86':'Q','87':'Q','88':'Q',
                                            '90':'R','91':'R','92':'R','93':'R',
                                            '94':'S','95':'S','96':'S',
                                            '97':'T','98':'T',
                                            '99':'U',
                                            '0':'W','00':'W'
                                             })
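As an aside, the long literal dictionary above could also be generated from (start, end, letter) ranges; a sketch under the assumption that these bounds mirror the `replace()` mapping:

```python
# (start, end, letter) ranges mirroring the literal replace() dict above.
ranges = [
    (1, 3, 'A'), (5, 9, 'B'), (10, 33, 'C'), (35, 35, 'D'),
    (36, 39, 'E'), (40, 40, 'F'), (45, 48, 'G'), (49, 53, 'H'),
    (55, 56, 'I'), (58, 63, 'J'), (64, 66, 'K'), (68, 68, 'L'),
    (69, 75, 'M'), (77, 82, 'N'), (83, 84, 'O'), (85, 85, 'P'),
    (86, 88, 'Q'), (90, 93, 'R'), (94, 96, 'S'), (97, 98, 'T'),
    (99, 99, 'U'), (0, 0, 'W'),
]
mapping = {}
for start, end, letter in ranges:
    for code in range(start, end + 1):
        # Cover both the unpadded and zero-padded string forms.
        mapping[str(code)] = letter
        mapping[f'{code:02d}'] = letter

print(mapping['48'], mapping['85'], mapping['0'])  # G P W
```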

Let's see how our new variables turned out:

In [67]:
df_1.head()
Out[67]:
REGION AGLOMERADO MAS_500 Sexo Edad NIVEL_ED_2 NIVEL_ED ESTADO CAT_OCUP Cant_Ocup ... Cod_Ocup Lugar_trab Ingresos Tamaño_empr_2 CARACTER_OCUP JERARQUIA_OCUP TECNOLOGIA_OCUP CALIFICACION_OCUP ACTIV_ECON CAT_ECON
0 PAMPEANA 14 N F 42 6-Terciario 6-Superior universit compl 1 Obrero o empleado 0.0 ... 48311 1 150000 Pequeña 48 Asalariados Sin op. máq. Profesionales 84 O
1 PAMPEANA 14 N F 19 6-Terciario 5-Superior universit incompl 1 Obrero o empleado 0.0 ... 30333 1 10000 Microempresa 30 Asalariados Op. sist. informatizados Operativo 48 G
2 PAMPEANA 14 N M 13 4-Secundario 3-Secundario incompl 0 0 0.0 ... 0 0 0 0) Ns/Nr 0 0 0 0 0 W
4 GBA 33 S M 68 4-Secundario 4-Secundario compl 0 0 0.0 ... 0 0 0 0) Ns/Nr 0 0 0 0 0 W
6 PAMPEANA 30 N M 52 6-Terciario 6-Superior universit compl 1 Obrero o empleado 0.0 ... 41332 1 80000 0) Ns/Nr 41 Asalariados Op. sist. informatizados Técnicos 85 P

5 rows × 25 columns

15) Number of occupations¶

In [68]:
df_1['Cant_Ocup'].value_counts()
Out[68]:
0.0    44748
2.0     1299
3.0      130
4.0       31
5.0       22
6.0        6
7.0        3
9.0        2
Name: Cant_Ocup, dtype: int64

Based on what the documentation indicates, we will assume that the value 9 corresponds to the category Doesn't know/No answer, and we will replace it with zero.

In [69]:
df_1['Cant_Ocup']=df_1['Cant_Ocup'].replace({9:0})

Note that the value 1 is absent; that is, the field was left at zero for people who have a single occupation. We will assign the value 1 to those who have a value of zero but earn income from work.

In [70]:
#We set to 1 those people who earn income from work but did not fill in the number-of-occupations field.
df_1['Cant_Ocup'] = np.where((df_1['Cant_Ocup'] <2 ) & (df_1['Ingresos'] > 0), 1, df_1['Cant_Ocup'])
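The `np.where` pattern used above (condition, value if true, value if false) can be illustrated on a toy frame that mirrors the rule:

```python
import numpy as np
import pandas as pd

# Toy frame mirroring the rule: people with fewer than 2 recorded
# occupations but positive income get Cant_Ocup set to 1.
toy = pd.DataFrame({'Cant_Ocup': [0.0, 0.0, 2.0],
                    'Ingresos':  [50000, 0, 80000]})
toy['Cant_Ocup'] = np.where((toy['Cant_Ocup'] < 2) & (toy['Ingresos'] > 0),
                            1, toy['Cant_Ocup'])
print(toy['Cant_Ocup'].tolist())  # [1.0, 0.0, 2.0]
```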
In [71]:
#We verify that the values were replaced correctly
df_1['Cant_Ocup'].value_counts()
Out[71]:
0.0    28989
1.0    15761
2.0     1299
3.0      130
4.0       31
5.0       22
6.0        6
7.0        3
Name: Cant_Ocup, dtype: int64
In [72]:
#We convert the column to integer type
df_1['Cant_Ocup']=df_1['Cant_Ocup'].astype(int)

Copy of the dataset¶

Before continuing with the EDA, we save a copy of the dataframe as it stands now; we will need it later, in the Machine Learning modeling stage.

In [73]:
df_1a= df_1.copy()

16) Age (in ranges)¶

In order to analyze the income values of the economically active population, we will take the portion aged between 18 (the age of majority) and 65 (the retirement age for men; for women it is 60).

In [74]:
#We filter by age to keep people aged 18 or older
df_1=df_1[df_1['Edad']>=18]
In [75]:
#We filter by age to keep people aged 65 or younger
df_1=df_1[df_1['Edad']<=65]

We will create a new ordinal categorical variable from the 'Edad' column, to have the ages split into ranges.

In [76]:
#We create a new column 'Edad_2' that groups people by age range
df_1['Edad_2'] = pd.qcut(df_1['Edad'], 6) 
df_1['Edad_2'].value_counts()
Out[76]:
(17.999, 24.0]    5268
(38.0, 46.0]      4704
(30.0, 38.0]      4699
(55.0, 65.0]      4526
(46.0, 55.0]      4328
(24.0, 30.0]      4161
Name: Edad_2, dtype: int64
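As a side note, `pd.qcut` picks the bin edges from quantiles of the data, which is why the six ranges above hold roughly equal counts. A toy sketch, with two bins and illustrative label names, of how `labels=` yields readable names instead of interval objects:

```python
import pandas as pd

# pd.qcut chooses bin edges from the data's quantiles, so each bin holds
# roughly the same number of people. labels= replaces the interval
# objects with readable names (names here are illustrative).
ages = pd.Series([18, 22, 25, 31, 35, 42, 50, 58, 64, 65])
bins = pd.qcut(ages, 2, labels=['younger half', 'older half'])
print(bins.value_counts().to_dict())
```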

Univariate analysis¶

Now that we have cleaned the data and prepared it for visualization, we will proceed with the univariate analysis.

This consists of analyzing each of the variables under study separately; it is based exclusively on a single variable. It is the most basic, and therefore most preliminary, kind of analysis, descriptive rather than relational or causal. The most common univariate techniques are the frequency distribution for a one-variable table and the analysis of the variable's measures of central tendency and dispersion (mean, median, mode, variance, standard deviation, quartiles, among others).
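The measures mentioned above can be computed directly in pandas; a minimal sketch on a toy income sample (values are invented, not taken from the EPH):

```python
import pandas as pd

# Toy income sample (illustrative values, in pesos).
ingresos = pd.Series([20000, 30000, 45000, 45000, 70000, 150000])
print('mean:', ingresos.mean())
print('median:', ingresos.median())
print('mode:', ingresos.mode()[0])
print('std:', round(ingresos.std(), 2))
print('quartiles:', ingresos.quantile([0.25, 0.5, 0.75]).tolist())
```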

We will split the analysis according to whether the variables are numeric or categorical.

Numeric variables¶

In [77]:
#We gather all the numeric features in the list 'var_num'
var_num = ['Edad','Horas_sem','Ingresos']

Before plotting, we will remove two particular outliers that muddy the visualizations:

  • An income value equal to $2 million (although it could be a real salary, we will remove it)
  • A weekly-hours-worked value equal to 999 (a poorly specified value).

In addition, for the visualizations we will only consider people with income greater than zero, that is, those who work.

In [78]:
df_1=df_1[df_1['Ingresos']!=2000000]
df_1=df_1[df_1['Horas_sem']!=999]
df_1=df_1[df_1['Ingresos']>0]
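As a minor style note, the three filters above can be combined into a single boolean mask, avoiding intermediate copies of the dataframe; a toy sketch:

```python
import pandas as pd

# Toy frame with one valid row, one $2M income, one 999-hours row.
toy = pd.DataFrame({'Ingresos': [45000, 2000000, 0, 80000],
                    'Horas_sem': [40, 35, 999, 999]})
mask = ((toy['Ingresos'] != 2000000)
        & (toy['Horas_sem'] != 999)
        & (toy['Ingresos'] > 0))
print(toy[mask]['Ingresos'].tolist())  # [45000]
```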

Next we will draw a distribution plot and a boxplot for each of the numeric variables.

In [79]:
#We define the plotting function
def plot_num_vars(df_1, colname):

  fig, axes = plt.subplots(1, 2, figsize=(20, 8))

  #distribution plot (sns.distplot is deprecated; histplot with kde=True is the current equivalent)
  plot00 = sns.histplot(df_1[colname], kde=True, ax=axes[0])
  axes[0].set_title('Distribution of {name}'.format(name=colname))

  #boxplot (passed as x= since positional data arguments are deprecated in recent seaborn)
  plot01 = sns.boxplot(x=df_1[colname], ax=axes[1])
  axes[1].set_title('Boxplot of {name}'.format(name=colname))

  plt.show()
In [80]:
#We plot each numeric variable
for i in var_num:
  plot_num_vars(df_1,i)

Next we will draw boxplots for the variables Age and Weekly hours worked, split by sex and by occupational category (recall that the latter is, in principle, the variable most correlated with income).

1) Age¶

In [81]:
fig = px.box(df_1, 
                 x="Edad",          
                 color= 'Sexo', 
                 title= 'Edad según sexo'
                 )
fig.update_layout(
    height = 500,
    width = 500,
    title={
        'xanchor': 'center',
        'yanchor': 'top',
        'y':0.97,
        'x':0.5,
        },
        xaxis_title = 'Edad de los trabajadores'
    )
fig.show()
In [82]:
df_1.groupby(['Sexo']).agg({'Edad':['mean','median']}).reset_index()
Out[82]:
Sexo Edad
mean median
0 F 40.382077 40.0
1 M 39.749946 39.0
In [83]:
fig = px.box(df_1, 
                 x="Edad",          
                 color= 'CAT_OCUP', 
                 title= 'Edad según categoría ocupacional'
                 )
fig.update_layout(
    height = 500,
    width = 800,
    title={
        'xanchor': 'center',
        'yanchor': 'top',
        'y':0.97,
        'x':0.5,
        },
        xaxis_title = 'Edad de los trabajadores'
    )
fig.show()
In [84]:
df_1.groupby(['CAT_OCUP']).agg({'Edad':['mean','median']}).reset_index()
Out[84]:
CAT_OCUP Edad
mean median
0 Cuenta Propia 42.196154 42.0
1 Obrero o empleado 39.283410 39.0
2 Patron 44.910828 46.0

2) Weekly hours¶

In [85]:
fig = px.box(df_1, 
                 x="Horas_sem",          
                 color= 'Sexo', 
                 title= 'Horas semanales trabajadas según sexo'
                 )
fig.update_layout(
    height = 500,
    width = 700,
    title={
        'xanchor': 'center',
        'yanchor': 'top',
        'y':0.97,
        'x':0.5,
        },
        xaxis_title = 'Horas semanales trabajadas'
    )
fig.show()
In [86]:
df_1.groupby(['Sexo']).agg({'Horas_sem':['mean','median']}).reset_index()
Out[86]:
Sexo Horas_sem
mean median
0 F 28.087202 30.0
1 M 38.132394 40.0

Looking at the distribution of the dataset by sex, we see that, on average, men work 10 more hours per week than women. (Keep in mind that we are analyzing paid work; unpaid hours spent on housework or caring for family members are not included.)

In [87]:
fig = px.box(df_1, 
                 x="Horas_sem",           
                 color= 'CAT_OCUP', 
                 title= 'Horas semanales trabajadas según categoría ocupacional'
                 )
fig.update_layout(
    height = 500,
    width = 800,
    title={
        'xanchor': 'center',
        'yanchor': 'top',
        'y':0.97,
        'x':0.5,
        },
        xaxis_title = 'Horas semanales trabajadas'
    )
fig.show()
In [88]:
df_1.groupby(['CAT_OCUP']).agg({'Horas_sem':['mean','median']}).reset_index()
Out[88]:
CAT_OCUP Horas_sem
mean median
0 Cuenta Propia 35.495858 36.0
1 Obrero o empleado 32.799657 36.0
2 Patron 43.375796 44.0

We detect no significant difference between salaried workers and the (non-employer) self-employed in terms of hours worked: on average, both categories work around 36 hours a week. Employers, for their part, work 8 more hours a week than the other categories.

3) Income¶

Since this is the target variable, it will be characterized in the bivariate analysis, contrasting it against each of the other variables. Here, however, we preview its main summary statistics for the age range defined for the EDA.

In [89]:
df_1['Ingresos'].describe().round(0)
Out[89]:
count      16684.0
mean       55563.0
std        47893.0
min          200.0
25%        26000.0
50%        45000.0
75%        70000.0
max      1000000.0
Name: Ingresos, dtype: float64
In [90]:
ingresos_de_trabajadores_en_dólares = (55563/116.51) #We use the official exchange rate as of March 2022
ingresos_de_trabajadores_en_dólares = round(ingresos_de_trabajadores_en_dólares, 0)
print('Average income of Argentine workers in dollars (ages 18 to 65): U$D ' + str(ingresos_de_trabajadores_en_dólares))
Average income of Argentine workers in dollars (ages 18 to 65): U$D 477.0
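The conversion above can be wrapped in a small helper so the exchange rate is an explicit parameter (116.51 ARS/USD is the official March 2022 rate cited in the text; the function name is ours):

```python
# Helper for the peso-to-dollar conversion. The default rate of
# 116.51 ARS/USD is the official March 2022 rate cited in the text.
def pesos_to_usd(pesos, rate=116.51):
    return round(pesos / rate)

print(pesos_to_usd(55563))  # 477
print(pesos_to_usd(45000))  # 386
```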

Insights¶

Interpretation of the univariate plots (numeric variables):

  • Weekly hours worked:

Looking at the distribution of the dataset by sex, we see that, on average, men work 10 more hours per week than women. (Keep in mind that we are analyzing paid work; unpaid hours spent on housework or caring for family members are not included.)

We detect no significant difference between self-employed workers and employees in terms of hours worked, with a mean of about 36 hours a week for both categories.

  • Age of the workers: The average age of workers is 40, with almost no difference between men and women.

Employers work more hours per week on average than the other workers (44 hours, which could be read as an 8-hour workday Monday through Friday plus 4 hours on Saturdays), and their median age is also higher (46).

  • Income:

The average income of workers aged 18 to 65 is $55,563 Argentine pesos, about 477 US dollars at the official exchange rate of March 2022, the month the dataset's salaries correspond to.

If we take the median, which is not affected by outliers, the income of Argentine workers drops to $45,000 (386 US dollars).

75% of Argentine workers earn $70,000 (600 US dollars) or less.

Categorical variables¶

1) Sex¶

In [91]:
fig = px.pie(df_1, names='Sexo', title='Composición de los trabajadores según sexo', color_discrete_sequence=['#C42021','#6C0E23'])
fig.update_traces(textposition='inside', textinfo='percent+label')
fig.update_layout(
    height = 500,
    width = 500,
    title={
        'xanchor': 'center',
        'yanchor': 'top',
        'y':0.95,
        'x':0.5},
    showlegend=False,
    xaxis = {'categoryorder':'category ascending'}
        )
fig.show()

We see that in terms of sex the dataset is fairly balanced, with women representing almost 45% of the workers.

2) Region¶

In [92]:
df_1['REGION'].value_counts(normalize=True)
Out[92]:
PAMPEANA     0.270379
NOA          0.255334
PATAGONIA    0.142412
CUYO         0.117538
GBA          0.110705
NEA          0.103632
Name: REGION, dtype: float64
In [93]:
fig= px.histogram(df_1,  
                  y='REGION', 
                  text_auto='.2s',
                  color_discrete_sequence=['#542344'])
fig.update_traces(
    textposition = 'outside',
)
fig.update_layout(
    height = 700,
    width = 900,
    yaxis = {'categoryorder':'total ascending'},
    yaxis_title = 'Región de residencia',
    xaxis_title = 'Recuento de personas')
fig.show()

The composition of the dataset by region of the country, expressed as a percentage, is shown in the following map produced in QGIS:

image.png

We see that most of the people in the dataset live in the Pampeana region, which includes the provinces of La Pampa, Córdoba, Santa Fe, Entre Ríos and Buenos Aires, excluding the Greater Buenos Aires area (which covers the Autonomous City of Buenos Aires and its surroundings).

3) Agglomeration and 4) Agglomeration size¶

We will convert the 'AGLOMERADO' column from integer to string, to make it easier to read in the chart.

In [94]:
#We convert the 'AGLOMERADO' column to string
df_1['AGLOMERADO']=df_1['AGLOMERADO'].astype(str)
In [95]:
#We verify that it was converted correctly
df_1['AGLOMERADO'].value_counts()
Out[95]:
33    1305
23     927
13     844
19     757
10     752
29     742
27     728
18     663
4      623
25     593
22     578
32     542
91     495
36     486
26     481
8      472
38     431
31     428
15     428
12     419
17     413
7      410
5      389
14     388
9      375
93     358
6      340
20     307
30     264
34     262
2      254
3      230
Name: AGLOMERADO, dtype: int64

We plot, ordering the agglomerations from largest to smallest by number of respondents:

In [96]:
fig= px.histogram(df_1,  
                  y='AGLOMERADO', 
                  text_auto='.2s',
                  color='MAS_500')
                  

fig.update_layout(
    height = 800,
    width = 1000,
    yaxis = {'categoryorder':'total ascending'},
    yaxis_title = 'Aglomerado urbano de residencia',
    xaxis_title = 'Recuento de personas')
fig.show()

The agglomeration with the most surveyed workers is 33=Partidos del GBA, belonging to the Greater Buenos Aires region, with 1,305 people, followed by 23=Gran Salta, in the NOA region (Argentine Northwest), with 927 respondents.

On the other hand, the agglomeration with the fewest respondents is 03=Bahía Blanca-Cerri, in the Pampeana region, with 230 people.

In the chart, the agglomerations are color-coded according to whether or not they have more than 500 thousand inhabitants.

5) Type of company¶

In [97]:
df_1['Tipo_empr'].value_counts()
Out[97]:
Privada    12164
Estatal     4268
Otro         252
Name: Tipo_empr, dtype: int64
In [98]:
fig = px.pie(df_1, names='Tipo_empr', title='Composición de los trabajadores según tipo de empresa/negocio', color_discrete_sequence=['#C42021','#6C0E23','#D9828B'])
fig.update_traces(textposition='inside', textinfo='percent+label')
fig.update_layout(
    height = 500,
    width = 700,
    title={
        'xanchor': 'center',
        'yanchor': 'top',
        'y':0.95,
        'x':0.5},
    showlegend=False,
    xaxis = {'categoryorder':'category ascending'}
        )
fig.show()

We see that almost 3 out of every 4 surveyed workers are employed in private-sector companies.

6) Physical workplace¶

We will convert the 'Lugar_trab' column from integer to string, to make it easier to read in the chart.

In [99]:
#We convert the 'Lugar_trab' column to string
df_1['Lugar_trab']=df_1['Lugar_trab'].astype(str)
In [100]:
df_1['Lugar_trab'].value_counts(normalize=True)
Out[100]:
1     0.614301
8     0.102074
0     0.069947
6     0.065931
4     0.045553
9     0.043215
5     0.035004
10    0.012047
3     0.006593
7     0.003476
2     0.001858
Name: Lugar_trab, dtype: float64
In [101]:
fig= px.histogram(df_1,  
                  y='Lugar_trab', 
                  text_auto='.2s',
                  color_discrete_sequence=['#542344'])
                  
fig.update_layout(
    height = 800,
    width = 1000,
    yaxis = {'categoryorder':'total ascending'},
    yaxis_title = 'Lugar físico de trabajo',
    xaxis_title = 'Recuento de personas')
fig.show()

From the chart we see that 61% of the workers in the dataset have as their physical workplace category 1="Store/office/establishment/business/workshop/farm/estate", followed by categories 8="Clients' home/premises" (10% of workers) and 6="In their own home, with no dedicated space" (6.5% of workers). Category "0", which represents a non-response, is not counted.
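Before plotting, the bare codes could also be mapped to short labels; a sketch with a hypothetical abbreviated label map (wording shortened from the survey categories quoted above):

```python
import pandas as pd

# Hypothetical shorthand labels for a few Lugar_trab codes.
labels = {'1': 'Local/establecimiento',
          '8': 'Domicilio de clientes',
          '6': 'Vivienda propia (sin lugar exclusivo)',
          '0': 'Ns/Nr'}
# Toy series of codes; .map() turns them into readable labels.
codes = pd.Series(['1', '8', '1', '6', '0'])
print(codes.map(labels).value_counts().to_dict())
```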

7) Educational level 1¶

In [102]:
df_1['NIVEL_ED'].value_counts(normalize=True)
Out[102]:
4-Secundario compl              0.306162
6-Superior universit compl      0.229741
3-Secundario incompl            0.177236
5-Superior universit incompl    0.135879
2-Primario compl                0.118317
1-Primario incompl              0.030568
0-Sin instrucción               0.002098
Name: NIVEL_ED, dtype: float64
In [103]:
fig= px.histogram(df_1,  
                  y='NIVEL_ED', 
                  text_auto='.2s',
                  color_discrete_sequence=['#542344'])
                  
fig.update_layout(
    height = 600,
    width = 1000,
    yaxis = {'categoryorder':'total ascending'},
    yaxis_title = 'Máximo nivel académico alcanzado',
    xaxis_title = 'Recuento de personas')
fig.show()

We observe that 30% of workers finished secondary school. In second place are those with a complete university education, making up almost 23% of the dataset.

8) Educational level 2¶

In [104]:
df_1['NIVEL_ED_2'].value_counts(normalize=True)
Out[104]:
4-Secundario                      0.456725
7-Universitario                   0.201031
6-Terciario                       0.152481
2-Primario                        0.143431
5-Polimodal                       0.018401
3-EGB                             0.013546
8-Posgrado universitario          0.012047
1-Jardín/preescolar/Sin instr.    0.002038
0-Educación especial              0.000300
Name: NIVEL_ED_2, dtype: float64
In [105]:
fig= px.histogram(df_1,  
                  y='NIVEL_ED_2', 
                  text_auto='.2s',
                  color_discrete_sequence=['#542344'])
                  
fig.update_layout(
    height = 600,
    width = 1000,
    yaxis = {'categoryorder':'total ascending'},
    yaxis_title = 'Máximo nivel académico alcanzado-2',
    xaxis_title = 'Recuento de personas')
fig.show()

Although this variable is similar to the previous one, it is interesting because it includes the postgraduate category, absent from NIVEL_ED, and it also splits higher education into tertiary and university. We observe that almost half of the workers in the dataset reached secondary school (complete or incomplete) as their highest academic level, more than doubling the next category, university level. Note also the low share of people with postgraduate studies (approx. 1.2%).

9) Activity status¶

In [106]:
df_1['ESTADO'].value_counts()
Out[106]:
1    16684
Name: ESTADO, dtype: int64

In the 'ESTADO' variable it is correct that all values belong to category 1, "Employed", since we previously selected for the EDA only people who work. This column will be included in the Machine Learning modeling to assess how relevant it turns out to be.
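As a side note, constant columns like this one can be flagged programmatically with `nunique()`; whether to actually drop 'ESTADO' is deferred to the modeling stage, as noted above. A toy sketch:

```python
import pandas as pd

# Toy frame where 'ESTADO' is constant after the filtering above.
toy = pd.DataFrame({'ESTADO': [1, 1, 1],
                    'Ingresos': [45000, 70000, 26000]})
# A column with a single distinct value carries no information.
constant_cols = [c for c in toy.columns if toy[c].nunique() <= 1]
print(constant_cols)  # ['ESTADO']
```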

10) Occupational category¶

In [107]:
df_1['CAT_OCUP'].value_counts()
Out[107]:
Obrero o empleado    12833
Cuenta Propia         3380
Patron                 471
Name: CAT_OCUP, dtype: int64
In [108]:
df_1['CAT_OCUP'].value_counts(normalize=True)
Out[108]:
Obrero o empleado    0.769180
Cuenta Propia        0.202589
Patron               0.028231
Name: CAT_OCUP, dtype: float64
In [109]:
df_1.Sexo.groupby(df_1['CAT_OCUP']).value_counts(normalize=True)
Out[109]:
CAT_OCUP           Sexo
Cuenta Propia      M       0.594083
                   F       0.405917
Obrero o empleado  M       0.536118
                   F       0.463882
Patron             M       0.726115
                   F       0.273885
Name: Sexo, dtype: float64
In [110]:
fig= px.histogram(df_1,  
                  x='CAT_OCUP', 
                  text_auto='.2s',
                  color='Sexo')
fig.update_traces(
    textposition = 'outside',
)
fig.update_layout(
    height = 500,
    width = 500,
    xaxis = {'categoryorder':'total ascending'},
    xaxis_title = 'Categoría ocupacional',
    yaxis_title = 'Recuento de personas',
    title_text="CATEGORÍA OCUPACIONAL SEGÚN SEXO")
fig.show()

In terms of occupational category, 77% of the dataset consists of salaried workers (laborers or employees). They are followed by independent or self-employed workers, at 20%. Finally, employers have the smallest share (less than 3% of the dataset).

Looking at the distribution by sex, every category has more men than women. The most unequal situation is among employers, 73% of whom are men.
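The per-category sex shares computed above with `groupby` can also be obtained in a single table with `pd.crosstab(..., normalize='index')`; a toy sketch with invented rows:

```python
import pandas as pd

# Toy frame: 4 employers (3 M, 1 F) and 2 self-employed (1 M, 1 F).
toy = pd.DataFrame({'CAT_OCUP': ['Patron', 'Patron', 'Patron', 'Patron',
                                 'Cuenta Propia', 'Cuenta Propia'],
                    'Sexo': ['M', 'M', 'M', 'F', 'M', 'F']})
# normalize='index' gives the sex composition within each category.
shares = pd.crosstab(toy['CAT_OCUP'], toy['Sexo'], normalize='index')
print(shares.loc['Patron', 'M'])  # 0.75
```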

11) Occupation intensity¶

In [111]:
df_1['INTENSI'].value_counts()
Out[111]:
2-Ocupado pleno                8784
3-Sobreocupado                 4605
1-Subocupado                   1720
2-Ocup q no trabajó ult sem    1575
Name: INTENSI, dtype: int64
In [112]:
df_1['INTENSI'].value_counts(normalize=True)
Out[112]:
2-Ocupado pleno                0.526492
3-Sobreocupado                 0.276013
1-Subocupado                   0.103093
2-Ocup q no trabajó ult sem    0.094402
Name: INTENSI, dtype: float64
In [113]:
df_1.Sexo.groupby(df_1['INTENSI']).value_counts(normalize=True)
Out[113]:
INTENSI                      Sexo
1-Subocupado                 F       0.541860
                             M       0.458140
2-Ocup q no trabajó ult sem  F       0.587937
                             M       0.412063
2-Ocupado pleno              M       0.522655
                             F       0.477345
3-Sobreocupado               M       0.695331
                             F       0.304669
Name: Sexo, dtype: float64
In [114]:
fig= px.histogram(df_1,  
                  x='INTENSI', 
                  text_auto='.2s',
                  color='Sexo')
fig.update_traces(
    textposition = 'outside',
)
fig.update_layout(
    height = 500,
    width = 600,
    xaxis = {'categoryorder':'total ascending'},
    xaxis_title = 'Intensidad de la ocupación',
    yaxis_title = 'Recuento de personas',
    title_text="INTENSIDAD OCUPACIONAL SEGÚN SEXO")
fig.show()

More than half of the surveyed workers report being fully employed (the share rises to 62% if we include the employed who did not work in the last week). The share of people who report being overemployed is high (28%); of these, 70% are men.

12) Company size¶

In [115]:
df_1['Tamaño_empr'].value_counts()
Out[115]:
0) Ns/Nr               3049
1) 1 pers              2661
2) 2 pers              1386
9) 41 a 100 pers       1378
6) 6 a 10 pers         1319
7) 11 a 25 pers        1236
8) 26 a 40 pers        1216
10) 101 a 200 pers      934
12) Más de 500 pers     891
3) 3 pers               878
11) 201 a 500 pers      683
4) 4 pers               548
5) 5 pers               505
Name: Tamaño_empr, dtype: int64
In [116]:
df_1['Tamaño_empr'].value_counts(normalize=True)
Out[116]:
0) Ns/Nr               0.182750
1) 1 pers              0.159494
2) 2 pers              0.083074
9) 41 a 100 pers       0.082594
6) 6 a 10 pers         0.079058
7) 11 a 25 pers        0.074083
8) 26 a 40 pers        0.072884
10) 101 a 200 pers     0.055982
12) Más de 500 pers    0.053404
3) 3 pers              0.052625
11) 201 a 500 pers     0.040937
4) 4 pers              0.032846
5) 5 pers              0.030269
Name: Tamaño_empr, dtype: float64
In [117]:
df_1.Sexo.groupby(df_1['Tamaño_empr']).value_counts(normalize=True)
Out[117]:
Tamaño_empr          Sexo
0) Ns/Nr             F       0.598885
                     M       0.401115
1) 1 pers            M       0.593386
                     F       0.406614
10) 101 a 200 pers   M       0.570664
                     F       0.429336
11) 201 a 500 pers   M       0.622255
                     F       0.377745
12) Más de 500 pers  M       0.586981
                     F       0.413019
2) 2 pers            M       0.636364
                     F       0.363636
3) 3 pers            M       0.682232
                     F       0.317768
4) 4 pers            M       0.645985
                     F       0.354015
5) 5 pers            M       0.681188
                     F       0.318812
6) 6 a 10 pers       M       0.576952
                     F       0.423048
7) 11 a 25 pers      M       0.538026
                     F       0.461974
8) 26 a 40 pers      F       0.508224
                     M       0.491776
9) 41 a 100 pers     M       0.539913
                     F       0.460087
Name: Sexo, dtype: float64
In [118]:
fig= px.histogram(df_1,  
                  y='Tamaño_empr', 
                  text_auto='.2s',
                  color= 'Sexo')
                  
fig.update_layout(
    height = 800,
    width = 1000,
    yaxis = {'categoryorder':'total ascending'},
    yaxis_title = 'Tamaño de la empresa',
    xaxis_title = 'Recuento de personas',
    title_text="TAMAÑO DE LA EMPRESA (Se incluye diferenciación por sexo)")
fig.show()

Leaving aside the Doesn't know/No answer category, we observe that most respondents work in one-person companies or businesses (16%). At the other extreme, only 5% work in companies with more than 500 employees.

In every case there is higher participation of men than women, except in companies with 26 to 40 employees, where the composition by sex is practically 50/50.

12.1) Company size 2¶

In [119]:
df_1['Tamaño_empr_2'].value_counts(normalize=True)
Out[119]:
Microempresa    0.358307
Pequeña         0.226025
0) Ns/Nr        0.182750
Mediana         0.138576
Grande          0.094342
Name: Tamaño_empr_2, dtype: float64
In [120]:
df_1.Sexo.groupby(df_1['Tamaño_empr_2']).value_counts(normalize=True)
Out[120]:
Tamaño_empr_2  Sexo
0) Ns/Nr       F       0.598885
               M       0.401115
Grande         M       0.602287
               F       0.397713
Mediana        M       0.552336
               F       0.447664
Microempresa   M       0.628638
               F       0.371362
Pequeña        M       0.536728
               F       0.463272
Name: Sexo, dtype: float64
In [121]:
fig= px.histogram(df_1,  
                  y='Tamaño_empr_2', 
                  text_auto='.2s',
                  color= 'Sexo')
                  
fig.update_layout(
    height = 600,
    width = 700,
    yaxis = {'categoryorder':'total ascending'},
    yaxis_title = 'Tamaño de la empresa 2',
    xaxis_title = 'Recuento de personas',
    title_text="TAMAÑO DE LA EMPRESA (Se incluye diferenciación por sexo)")
fig.show()

Having reduced the categories, we now see that almost 36% of the workers in the sample belong to microenterprises (recall that above we defined them as those with 5 or fewer employees). Next, 23% of workers fall in the range of small companies (6 to 40 employees). For their part, 14% report working in medium-sized companies (41 to 200 workers) and 9% in large companies (those with more than 200 workers).

As for participation by sex, it appears even for small and medium-sized companies, while the gap becomes more evident at the extremes (microenterprises and large companies). In the latter, the composition of men and women is roughly 60/40.

13) Occupation code (on professions and hierarchies)¶

13.1) Occupational character¶
In [122]:
df_1['CARACTER_OCUP'].value_counts(normalize=True)
Out[122]:
30    0.114541
72    0.103033
10    0.101594
80    0.070367
41    0.068149
56    0.054184
55    0.051846
34    0.045613
40    0.044953
57    0.036262
20    0.035663
53    0.035603
82    0.028111
5     0.023615
48    0.017562
58    0.016003
47    0.014984
32    0.012827
36    0.012467
11    0.009590
92    0.008032
35    0.007792
6     0.007372
31    0.006713
33    0.005934
81    0.005634
3     0.005574
60    0.004735
51    0.004675
46    0.004615
45    0.004435
99    0.004256
42    0.003836
90    0.003836
50    0.003476
44    0.003177
70    0.003117
49    0.003057
43    0.002338
7     0.002038
54    0.001558
61    0.001379
71    0.001079
52    0.000959
62    0.000719
64    0.000719
91    0.000659
2     0.000539
63    0.000360
1     0.000180
4     0.000180
0     0.000060
Name: CARACTER_OCUP, dtype: float64
In [123]:
fig= px.histogram(df_1,  
                  y='CARACTER_OCUP', 
                  text_auto='.2s',
                  color= 'Sexo')
                  
fig.update_layout(
    height = 1300,
    width = 900,
    yaxis = {'categoryorder':'total ascending'},
    yaxis_title = 'Ocupación',
    xaxis_title = 'Recuento de personas',
    title_text="Ocupaciones de los trabajadores (Se incluye diferenciación por sexo)")
fig.show()

We see that the three most common occupation branches in the sample are "30- Direct sales" (with an 11% share), "72- Building construction, infrastructure works and distribution networks for energy, drinking water, gas, telephony and oil", and "10- Administrative management, planning and management control" (both at 10% of all workers).

As for the distribution by sex, the first and third categories show higher participation by women (53% and 56% respectively), while in the second, related to building construction and service infrastructure, female participation is very low (around 3%).

13.2) Occupational hierarchy¶
In [124]:
df_1['JERARQUIA_OCUP'].value_counts(normalize=True)
Out[124]:
Asalariados      0.731719
Cuenta propia    0.202230
Direccion        0.039499
Jefes            0.022237
Ns/Nr            0.004256
0                0.000060
Name: JERARQUIA_OCUP, dtype: float64
In [125]:
#We replace the value zero with Ns/Nr
df_1['JERARQUIA_OCUP']=df_1['JERARQUIA_OCUP'].replace(0,'Ns/Nr')
In [126]:
df_1.Sexo.groupby(df_1['JERARQUIA_OCUP']).value_counts(normalize=True)
Out[126]:
JERARQUIA_OCUP  Sexo
Asalariados     M       0.531455
                F       0.468545
Cuenta propia   M       0.593954
                F       0.406046
Direccion       M       0.676783
                F       0.323217
Jefes           M       0.649596
                F       0.350404
Ns/Nr           M       0.708333
                F       0.291667
Name: Sexo, dtype: float64
In [127]:
fig= px.histogram(df_1,  
                  y='JERARQUIA_OCUP', 
                  text_auto='.2s',
                  color= 'Sexo')
                  
fig.update_layout(
    height = 500,
    width = 900,
    yaxis = {'categoryorder':'total ascending'},
    yaxis_title = 'Jerarquía ocupacional',
    xaxis_title = 'Recuento de personas',
    title_text="Jerarquía ocupacional (Se incluye diferenciación por sexo)")
fig.show()

73% of workers sit at the lowest rung of the hierarchy, as wage earners. Bosses and directors together account for 6% of the dataset; note that for these two groups, male participation exceeds 65%.

13.3) Occupational technology
In [128]:
df_1['TECNOLOGIA_OCUP'].value_counts(normalize=True)
Out[128]:
Sin op. máq.                0.582055
Op. sist. informatizados    0.264265
Con op. máq.                0.087629
Ns/Nr                       0.065991
0                           0.000060
Name: TECNOLOGIA_OCUP, dtype: float64
In [129]:
# Replace the zero value with Ns/Nr
df_1['TECNOLOGIA_OCUP']=df_1['TECNOLOGIA_OCUP'].replace(0,'Ns/Nr')
In [130]:
df_1.Sexo.groupby(df_1['TECNOLOGIA_OCUP']).value_counts(normalize=True)
Out[130]:
TECNOLOGIA_OCUP           Sexo
Con op. máq.              M       0.844049
                          F       0.155951
Ns/Nr                     M       0.669691
                          F       0.330309
Op. sist. informatizados  F       0.548197
                          M       0.451803
Sin op. máq.              M       0.542272
                          F       0.457728
Name: Sexo, dtype: float64
In [131]:
fig= px.histogram(df_1,  
                  y='TECNOLOGIA_OCUP', 
                  text_auto='.2s',
                  color= 'Sexo')
                  
fig.update_layout(
    height = 500,
    width = 900,
    yaxis = {'categoryorder':'total ascending'},
    yaxis_title = 'Tecnología ocupacional',
    xaxis_title = 'Recuento de personas',
    title_text="Tipo de tecnología empleada por los trabajadores (Se incluye diferenciación por sexo)")
fig.show()

58% of people work without operating any kind of machine; 54% of them are men.

Those who operate some kind of machine represent only 9% of workers, and 84% of them are men.

People who work with computers represent 26% of the sample. Here the proportion by sex is reversed, with women accounting for 54% of this category.

13.4) Occupational qualification
In [132]:
df_1['CALIFICACION_OCUP'].value_counts(normalize=True)
Out[132]:
Operativo        0.528710
No calificado    0.209902
Técnicos         0.180652
Profesionales    0.076241
Ns/Nr            0.004495
Name: CALIFICACION_OCUP, dtype: float64
In [133]:
df_1.Sexo.groupby(df_1['CALIFICACION_OCUP']).value_counts(normalize=True)
Out[133]:
CALIFICACION_OCUP  Sexo
No calificado      F       0.583952
                   M       0.416048
Ns/Nr              M       0.706667
                   F       0.293333
Operativo          M       0.640970
                   F       0.359030
Profesionales      M       0.505503
                   F       0.494497
Técnicos           F       0.527870
                   M       0.472130
Name: Sexo, dtype: float64
In [134]:
fig= px.histogram(df_1,  
                  y='CALIFICACION_OCUP', 
                  text_auto='.2s',
                  color= 'Sexo')
                  
fig.update_layout(
    height = 500,
    width = 900,
    yaxis = {'categoryorder':'total ascending'},
    yaxis_title = 'Calificación ocupacional',
    xaxis_title = 'Recuento de personas',
    title_text="Calificación de los trabajadores (Se incluye diferenciación por sexo)")
fig.show()

Operative and unqualified personnel together account for nearly 74% of workers (consistent with what was observed above for the occupational hierarchy variable).

Technicians represent 18% of the sample, while professionals account for less than 8%. For both categories, the distribution by sex is roughly even.

14) Economic activity code

14.1) Economic activity
In [135]:
df_1['ACTIV_ECON'].value_counts(normalize=True)
Out[135]:
48    0.153800
84    0.134980
40    0.100396
85    0.088947
97    0.072165
        ...   
63    0.000300
50    0.000240
39    0.000060
12    0.000060
37    0.000060
Name: ACTIV_ECON, Length: 77, dtype: float64
In [136]:
fig= px.histogram(df_1,  
                  y='ACTIV_ECON', 
                  text_auto='.2s',
                  color= 'Sexo')
                  
fig.update_layout(
    height = 1400,
    width = 1000,
    yaxis = {'categoryorder':'total ascending'},
    yaxis_title = 'Código de actividad económica',
    xaxis_title = 'Recuento de personas',
    title_text="Actividad económica del lugar de trabajo (Se incluye diferenciación por sexo)")
fig.show()

Looking at the top three categories, 15% of people work in companies dedicated to "48- Comercio, Excepto de Vehículos Automotores y Motocicletas". The second-largest category, at 13%, is "84- Administración Pública y Defensa; Planes de Seguro Social Obligatorio". The third is "40- Construcción", with a 10% share.

Note that this variable is related to the occupational character variable: in both, the top three categories are very similar.

14.2) Economic category

This variable was created to group the economic activities into broader categories and simplify the analysis.
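The grouping just described can be sketched as a code-to-section lookup. The mapping below is an illustrative assumption (only a handful of codes are shown, with letters matching the section labels that appear later in this notebook), not the INDEC's official classifier:

```python
import pandas as pd

# Hypothetical excerpt of the code-to-section correspondence behind CAT_ECON;
# the real mapping comes from the INDEC activity classifier.
section_by_code = {
    48: 'G',  # Comercio, excepto vehículos automotores
    84: 'O',  # Administración pública y defensa
    40: 'F',  # Construcción
    85: 'P',  # Enseñanza
    97: 'T',  # Hogares como empleadores de personal doméstico
}

df = pd.DataFrame({'ACTIV_ECON': [48, 84, 40, 85, 97, 48]})
df['CAT_ECON'] = df['ACTIV_ECON'].map(section_by_code)
print(df['CAT_ECON'].value_counts(normalize=True))
```

With the full 77-code correspondence, `Series.map` produces the 20-letter categorical analyzed below.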

In [137]:
df_1['CAT_ECON'].value_counts(normalize=True)
Out[137]:
G    0.181012
O    0.134980
C    0.104531
F    0.100396
P    0.088947
T    0.072165
Q    0.065332
H    0.043695
S    0.041117
N    0.039139
I    0.029070
M    0.027811
R    0.017801
J    0.016603
K    0.013965
W    0.008691
E    0.006293
L    0.004076
D    0.003896
A    0.000480
Name: CAT_ECON, dtype: float64
In [138]:
df_1.Sexo.groupby(df_1['CAT_ECON']).value_counts(normalize=True)
Out[138]:
CAT_ECON  Sexo
A         M       0.875000
          F       0.125000
C         M       0.692661
          F       0.307339
D         M       0.876923
          F       0.123077
E         M       0.723810
          F       0.276190
F         M       0.968955
          F       0.031045
G         M       0.585762
          F       0.414238
H         M       0.877915
          F       0.122085
I         M       0.511340
          F       0.488660
J         M       0.689531
          F       0.310469
K         F       0.506438
          M       0.493562
L         M       0.514706
          F       0.485294
M         F       0.500000
          M       0.500000
N         M       0.693721
          F       0.306279
O         M       0.550178
          F       0.449822
P         F       0.725741
          M       0.274259
Q         F       0.742202
          M       0.257798
R         M       0.659933
          F       0.340067
S         F       0.530612
          M       0.469388
T         F       0.973422
          M       0.026578
W         M       0.682759
          F       0.317241
Name: Sexo, dtype: float64
In [139]:
fig= px.histogram(df_1,  
                  y='CAT_ECON', 
                  text_auto='.2s',
                  color= 'Sexo')
                  
fig.update_layout(
    height = 900,
    width = 800,
    yaxis = {'categoryorder':'total ascending'},
    yaxis_title = 'Código de categoría económica',
    xaxis_title = 'Recuento de personas',
    title_text="Categoría económica del lugar de trabajo (Se incluye diferenciación por sexo)")
fig.show()

Following the same criterion of examining the top three categories, 18% of people work in companies dedicated to "G- Comercio al por Mayor y al por Menor; Reparación de Vehículos, Automotores y Motocicletas". The second-largest category, at 13%, is "O- Administración Pública y Defensa; Planes de Seguro Social Obligatorio". These top two match the previous variable.

In this case, however, third place is practically shared between "C- Industria Manufacturera" and "F- Construcción", both at 10%. The shift occurs because the manufacturing category spans 24 economic activities, which gain weight once added together.

As for differences by sex, the category with the lowest participation of women is "F- Construcción", at 3%, while the category with the lowest participation of men is "T- Actividades de los Hogares como Empleadores de Personal Doméstico; Actividades de los Hogares como Productores de Bienes o Servicios para Uso Propio", at under 3%.

15) Number of occupations

In [140]:
df_1['Cant_Ocup'].value_counts(normalize=True)
Out[140]:
1    0.914469
2    0.074682
3    0.007252
4    0.001798
5    0.001259
6    0.000360
7    0.000180
Name: Cant_Ocup, dtype: float64
In [141]:
# Convert 'Cant_Ocup' to integer first, dropping the decimal part...
df_1['Cant_Ocup']=df_1['Cant_Ocup'].astype(int)

# ...then to string, so that plots treat it as a categorical variable.
df_1['Cant_Ocup']=df_1['Cant_Ocup'].astype(str)
In [142]:
df_1.Sexo.groupby(df_1['Cant_Ocup']).value_counts(normalize=True)
Out[142]:
Cant_Ocup  Sexo
1          M       0.564331
           F       0.435669
2          F       0.544141
           M       0.455859
3          F       0.661157
           M       0.338843
4          F       0.766667
           M       0.233333
5          F       0.857143
           M       0.142857
6          F       0.833333
           M       0.166667
7          F       1.000000
Name: Sexo, dtype: float64
In [143]:
fig= px.histogram(df_1,  
                  y='Cant_Ocup', 
                  text_auto='.2s',
                  color= 'Sexo')
                  
fig.update_layout(
    height = 500,
    width = 800,
    yaxis = {'categoryorder':'total ascending'},
    yaxis_title = 'Cantidad de ocupaciones',
    xaxis_title = 'Recuento de personas',
    title_text="Cantidad de ocupaciones (Se incluye diferenciación por sexo)")
fig.show()

We see that 91% of people hold a single occupation; of the rest, 7% report having two occupations and fewer than 1% report having three.

16) Age (in ranges)

In [144]:
df_1['Edad_2'].value_counts(normalize=True)
Out[144]:
(38.0, 46.0]      0.216795
(30.0, 38.0]      0.214097
(46.0, 55.0]      0.184968
(24.0, 30.0]      0.157336
(55.0, 65.0]      0.126468
(17.999, 24.0]    0.100336
Name: Edad_2, dtype: float64
In [145]:
df_1.Sexo.groupby(df_1['Edad_2']).value_counts(normalize=True)
Out[145]:
Edad_2          Sexo
(17.999, 24.0]  M       0.614098
                F       0.385902
(24.0, 30.0]    M       0.564190
                F       0.435810
(30.0, 38.0]    M       0.550952
                F       0.449048
(38.0, 46.0]    M       0.539674
                F       0.460326
(46.0, 55.0]    M       0.512962
                F       0.487038
(55.0, 65.0]    M       0.577251
                F       0.422749
Name: Sexo, dtype: float64
In [146]:
fig, axes = plt.subplots(figsize=(9,6))

# Bar chart of worker counts per age range
sns.countplot(x=df_1['Edad_2'], color='#08519C', ax=axes)
axes.set_title('Recuento de trabajadores según edad')

plt.show()

After splitting age into 6 ranges and treating it as categorical, the ranges with the most workers are 30-38 and 38-46 years, each representing about 21% of the sample. Female participation in these intervals is around 45%.

Recall that earlier, for the EDA, only people between 18 and 65 years of age were selected.
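The filter and binning recalled above can be sketched as follows. The use of `pd.qcut` to obtain six roughly equal-sized ranges is an assumption inferred from the uneven bin edges shown earlier, and the toy ages stand in for the real `df_1`:

```python
import pandas as pd

# Toy frame standing in for df_1; the notebook's frame has ~49k rows.
df = pd.DataFrame({'Edad': [17, 18, 25, 33, 40, 50, 60, 65, 70]})

# Keep only people of working age, as done earlier in the EDA.
df = df[(df['Edad'] >= 18) & (df['Edad'] <= 65)]

# Split age into 6 quantile-based ranges and treat it as categorical.
df['Edad_2'] = pd.qcut(df['Edad'], q=6)
print(df['Edad_2'].value_counts(normalize=True))
```

`pd.qcut` nudges the lowest edge slightly below the minimum, which would explain the `(17.999, 24.0]` label seen in the output above.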

Bivariate analysis

This is one of the simplest forms of statistical analysis, used to find out whether a relationship exists between two sets of values, usually labeled X and Y.

The procedure will be to compare each variable of interest against the target variable "Ingresos", in order to probe the hypotheses put forward and identify patterns to be taken into account when building the predictive model.
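In its simplest form, that procedure amounts to comparing the mean of the target across the levels of each categorical predictor. A minimal sketch with illustrative data (the column names match the notebook; the figures are made up):

```python
import pandas as pd

# Illustrative stand-in for df_1 with one predictor and the target.
df = pd.DataFrame({
    'CAT_OCUP': ['Obrero o empleado', 'Cuenta Propia', 'Obrero o empleado', 'Patron'],
    'Ingresos': [60000, 42000, 58000, 90000],
})

# Mean income per level of the predictor, sorted to surface the pattern.
summary = df.groupby('CAT_OCUP')['Ingresos'].mean().sort_values()
print(summary)
```

The plotly histograms below with `histfunc='avg'` compute exactly this kind of per-category mean, only rendered graphically.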

1) Occupational category vs. income

In [147]:
fig = make_subplots(rows=1, cols=2,
                    subplot_titles=('Histograma', "Boxplot"))

fig.add_trace(
    go.Histogram( x=df_1['CAT_OCUP'], 
                  y=df_1['Ingresos'],
                  histfunc='avg' 
     ),
     row=1, col=1

)

fig.add_trace(
    go.Box( x=df_1['CAT_OCUP'], 
            y=df_1['Ingresos'],
            boxmean='sd', # show the mean and the standard deviation
            boxpoints='suspectedoutliers', # suspected outliers only
            marker=dict(
            color='rgba(219, 64, 82, 0.6)',
            outliercolor='rgb(8,81,156)',
            line=dict(
            outliercolor='rgb(8,81,156)',
            outlierwidth=2)),
            line_color='rgb(8,81,156)'   
      ),
    row=1, col=2
)


fig.update_layout(height=600, width=1200, title_text="CATEGORÍA OCUPACIONAL VS. INGRESOS",
                  yaxis_title = 'Ingresos en miles de pesos argentinos',
                  xaxis_title = None, showlegend=False,
                  xaxis = {'categoryorder':'total ascending'})
fig.show()

Recall Hypothesis 1:

From the histogram we can see that, on average, salaried workers (laborers or employees) earn about ARS 18,000 more than the self-employed, i.e., roughly 43% more. The widest gap, however, is between employers and the self-employed: the former earn more than twice as much as the latter.

Moreover, if we place both employers and the self-employed in the independent-worker group, the boxplots show greater variability in the incomes of independent workers, given the standard deviations of each group, so Hypothesis 1 could well be true.

2) Sex vs. income

In [148]:
fig= px.histogram(df_1, 
                  x='Sexo', 
                  y='Ingresos',  
                  hover_name='Ingresos', 
                  histfunc='avg', text_auto='.2s', 
                  color_discrete_sequence=px.colors.sequential.RdBu
      )
fig.update_traces(
    textposition = 'outside'
) # modify trace-level settings
fig.update_layout(
    height = 600,
    width = 400,
    title={
        'text': 'INGRESOS SEGÚN SEXO',
        },
    xaxis = {'categoryorder':'category ascending'},
    yaxis_title = 'Ingresos en miles de pesos argentinos',
    xaxis_title = None,
    plot_bgcolor='white',
    font=dict(
        family="Calibri",
        size=14,
        color="black")
)

Without yet controlling for other variables, there is a deep gender pay gap: the data indicate that, on average, men earn 29% more than women.
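A gap like this is simply the ratio of the two group means minus one. A minimal sketch with hypothetical means (not the survey's actual values):

```python
# Hypothetical mean incomes by sex, in thousands of ARS.
mean_m = 64.5
mean_f = 50.0

# Relative gap: how much more, on average, men earn than women.
gap = mean_m / mean_f - 1
print(f"{gap:.0%}")  # prints "29%" with these illustrative numbers
```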

In [149]:
fig= px.histogram(df_1, 
                  x='CAT_OCUP', 
                  y='Ingresos',  
                  hover_name='Ingresos', 
                  color= 'Sexo',
                  barmode='group',
                  histfunc='avg', text_auto='.2s', 
                  color_discrete_sequence=px.colors.sequential.RdBu
      )
fig.update_traces(
    textposition = 'outside'
) # modify trace-level settings
fig.update_layout(
    height = 700,
    width = 600,
    title={
        'text': 'INGRESOS SEGÚN SEXO Y CATEGORÍA OCUPACIONAL',
        },
    xaxis = {'categoryorder':'total ascending'},
    yaxis_title = 'Ingresos en miles de pesos argentinos',
    xaxis_title = None,
    plot_bgcolor='white',
    font=dict(
        family="Calibri",
        size=14,
        color="black")
)

Recall Hypothesis 2:

For employers, who earn the highest incomes, the gender gap is 15.8% in favor of men.

For salaried workers (laborers or employees) the difference reaches 29.6%.

For the self-employed the gap widens even further, with men earning 34.8% more than women.

Hypothesis 2 therefore appears to hold fully, not only for salaried workers but also for independent ones.

Next we will contrast the remaining variables against income and, in some cases, break each chart down by occupational category and by sex.

3) Region of residence vs. income

In [150]:
fig= px.histogram(df_1, 
                  x='REGION', 
                  y='Ingresos',  
                  hover_name='Ingresos', 
                  histfunc='avg', text_auto='.2s', 
                  color_discrete_sequence=['#22516D']
      )
fig.update_traces(
    textposition = 'outside'
) # modify trace-level settings
fig.update_layout(
    height = 500,
    width = 700,
    title={
        'text': 'LUGAR DE RESIDENCIA VS. INGRESOS',
        },
    xaxis = {'categoryorder':'total descending'},
    yaxis_title = 'Ingresos en miles de pesos argentinos',
    xaxis_title = 'Región del país',
    plot_bgcolor='white',
    font=dict(
        family="Calibri",
        size=14,
        color="black")
)
In [151]:
fig= px.box(df_1, 
                  x='REGION', 
                  y='Ingresos',  
                  hover_name='Ingresos', 
                  color='REGION',
                  color_discrete_sequence=['#8E8DBE','#7A306C','#96F550','#845F95','#A9E4EF','#81F495']
      )
fig.update_xaxes(categoryorder='array', categoryarray= ['PATAGONIA','GBA','PAMPEANA','CUYO', 'NEA','NOA'])
fig.update_layout(
    height = 500,
    width = 700,
    title={
        'text': 'LUGAR DE RESIDENCIA VS. INGRESOS',
        },
    yaxis_title = 'Ingresos en miles de pesos argentinos',
    xaxis_title = 'Región del país',
    plot_bgcolor='white',
    font=dict(
        family="Calibri",
        size=14,
        color="black")
)

Recall Hypothesis 3:

The charts yield the following insights:

• There is a marked difference in income depending on the region of the country where workers live.

• A person living in the Patagonia region earns, on average, 85% more than someone working in the Argentine northwest.

• Hypothesis 3, which assumed that workers living in and around the capital earn more than those in the other regions, is not borne out. On average, the highest earners live in Patagonia, exceeding the incomes of the Greater Buenos Aires region by 14%.

In [152]:
fig= px.histogram(df_1, 
                  y='REGION', 
                  x='Ingresos',  
                  hover_name='Ingresos', 
                  color= 'CAT_OCUP',
                  barmode='group',
                  histfunc='avg', text_auto='.2s', 
                  color_discrete_sequence=['#8E8DBE', '#475669','#7A306C']
      )
fig.update_traces(
    textposition = 'outside'
) # modify trace-level settings
fig.update_layout(
    height = 500,
    width = 900,
    title={
        'text': 'LUGAR DE RESIDENCIA VS. INGRESOS- Según categoría ocupacional',
        },
    yaxis = {'categoryorder':'total ascending'},
    xaxis_title = 'Ingresos en miles de pesos argentinos',
    yaxis_title = 'Región del país',
    plot_bgcolor='white',
    font=dict(
        family="Calibri",
        size=14,
        color="black")
)

Comparing salaried workers against the self-employed, the widest gap appears in the Cuyo region, where a laborer or employee earns 47% more than an independent worker who is not an employer.

In [153]:
fig= px.histogram(df_1, 
                  x='REGION', 
                  y='Ingresos',  
                  hover_name='Ingresos', 
                  color= 'Sexo',
                  barmode='group',
                  histfunc='avg', text_auto='.2s', 
                  color_discrete_sequence=['#845F95', '#81F495']
      )
fig.update_traces(
    textposition = 'outside'
) # modify trace-level settings
fig.update_layout(
    height = 500,
    width = 700,
    title={
        'text': 'LUGAR DE RESIDENCIA VS. INGRESOS- Según sexo',
        },
    xaxis = {'categoryorder':'total descending'},
    yaxis_title = 'Ingresos en miles de pesos argentinos',
    xaxis_title = 'Región del país',
    plot_bgcolor='white',
    font=dict(
        family="Calibri",
        size=14,
        color="black")
)

The most unequal region in terms of gender is Greater Buenos Aires, where the average pay gap between men and women is 34%, in favor of the former.

Next, we will look at which urban agglomerations have the highest and lowest incomes.

In [154]:
fig= px.histogram(df_1, 
                  y='AGLOMERADO', 
                  x='Ingresos', 
                  color='MAS_500', 
                  hover_name='Ingresos', 
                  histfunc='avg', text_auto='.2s'
                  
      )
fig.update_traces(
    textposition = 'outside'
) # modify trace-level settings
fig.update_layout(
    height = 700,
    width = 600,
    title={
        'text': 'AGLOMERADO URBANO VS. INGRESOS (Color según tamaño)',
        },
    yaxis = {'categoryorder':'total ascending'},
    xaxis_title = 'Ingresos en miles de pesos argentinos',
    yaxis_title = 'Código de aglomerado',
    plot_bgcolor='white',
    font=dict(
        family="Calibri",
        size=14,
        color="black")
)

The three urban agglomerations with the highest incomes are:

  1. "09- Comodoro Rivadavia-Rada Tilly", located in the province of Chubut.
  2. "31- Ushuaia- Río Grande", located in the province of Tierra del Fuego, at the southern tip of Argentina.
  3. "32- Ciudad Autónoma de Buenos Aires", the country's capital. It lies within the Greater Buenos Aires region and is classified as an agglomeration of more than 500,000 inhabitants; according to the 2022 national census, it is home to more than 3.12 million people. Its average income is just 3.5% below that of the agglomerations topping the chart.

The first two belong to the Patagonia region and are classified as agglomerations of fewer than 500,000 inhabitants.

The three urban agglomerations with the lowest incomes are:

  • "25- La Rioja", in the Argentine Northwest region.
  • "18- Santiago del Estero- La Banda", also in the Argentine Northwest.
  • "Gran Resistencia", located in the province of Chaco, Argentine Northeast region.

All three are agglomerations of fewer than 500,000 people, with an average income of ARS 38,000.

One could suppose, then, that someone living in one of those Patagonian cities in the south of the country earns, on average, 2.5 times as much as someone living in the last three agglomerations mentioned, further north.

In [155]:
fig= px.histogram(df_1, 
                  y='AGLOMERADO', 
                  x='Ingresos', 
                  color='REGION', 
                  hover_name='Ingresos', 
                  histfunc='avg', text_auto='.2s',
                          
      )
fig.update_traces(
    textposition = 'outside'
) # modify trace-level settings
fig.update_layout(
    height = 700,
    width = 600,
    title={
        'text': 'AGLOMERADO URBANO VS. INGRESOS (Color según Región)',
        },
    yaxis = {'categoryorder':'total ascending'},
    xaxis_title = 'Ingresos en miles de pesos argentinos',
    yaxis_title = 'Código de aglomerado',
    plot_bgcolor='white',
    font=dict(
        family="Calibri",
        size=14,
        color="black")
)

Coloring the agglomerations by region makes the inequalities within a single region visible.

For example, within the Greater Buenos Aires region, between CABA and the rest of the GBA districts. A similar situation occurs in Cuyo, where the inhabitants of "26- Gran San Luis" earn considerably more than those of "10- Gran Mendoza" or "27- Gran San Juan".

4) Educational level vs. income

We will chart the first educational-level variable: 'NIVEL_ED'.

In [156]:
fig= px.box(df_1, 
                  x='NIVEL_ED', 
                  y='Ingresos',  
                  hover_name='Ingresos',  
                  color_discrete_sequence=['#3A182E']
      )

fig.update_layout(
    height = 500,
    width = 800,
    title={
        'text': 'NIVEL EDUCATIVO VS. INGRESOS',
        },
    yaxis_title = 'Ingresos en miles de pesos argentinos',
    xaxis_title = None,
    plot_bgcolor='white',
    font=dict(
        family="Calibri",
        size=14,
        color="black")
)

Now we will chart the second educational-level variable: 'NIVEL_ED_2'.

In [157]:
fig= px.box(df_1, 
                  x='NIVEL_ED_2', 
                  y='Ingresos',  
                  hover_name='Ingresos',  
                  color_discrete_sequence=['#3A182E']
      )

fig.update_layout(
    height = 500,
    width = 800,
    title={
        'text': 'NIVEL EDUCATIVO (Variable NIVEL_ED_2) VS. INGRESOS',
        },
    xaxis = {'categoryorder':'category descending'},
    yaxis_title = 'Ingresos en miles de pesos argentinos',
    xaxis_title = None,
    plot_bgcolor='white',
    font=dict(
        family="Calibri",
        size=14,
        color="black")
)

Recall Hypothesis 4:

Both charts suggest Hypothesis 4 is true: the higher the educational level attained, the higher the worker's income. This becomes especially evident once the postgraduate category is included: someone with postgraduate studies earns, on average, almost 4 times as much as someone who only attended preschool or has no schooling.

In [158]:
fig= px.histogram(df_1, 
                  y='NIVEL_ED_2', 
                  x='Ingresos',  
                  hover_name='Ingresos', 
                  color= 'CAT_OCUP',
                  barmode='group',
                  histfunc='avg', text_auto='.2s', 
                  color_discrete_sequence=['#D6A278', '#8E8DBE','#3A182E']
      )
fig.update_traces(
    textposition = 'outside'
) # modify trace-level settings
fig.update_layout(
    height = 800,
    width = 1000,
    title={
        'text': 'NIVEL EDUCATIVO VS. INGRESOS- Según categoría ocupacional',
        },
    yaxis = {'categoryorder':'total ascending'},
    xaxis_title = 'Ingresos en miles de pesos argentinos',
    yaxis_title = None,
    plot_bgcolor='white',
    font=dict(
        family="Calibri",
        size=14,
        color="black")
)

First, the highest earners are employers or employees with a postgraduate degree.

The widest gaps between occupational categories occur at the tertiary level, where an employer earns 2.2 times as much as a self-employed worker, while an employee earns 55% more than someone working on their own.

In [159]:
fig= px.histogram(df_1, 
                  x='NIVEL_ED_2', 
                  y='Ingresos',  
                  hover_name='Ingresos', 
                  color= 'Sexo',
                  barmode='group',
                  histfunc='avg', text_auto='.2s', 
                  color_discrete_sequence=['#8E8DBE','#3A182E']
      )
fig.update_traces(
    textposition = 'outside'
) # modify trace-level settings
fig.update_layout(
    height = 500,
    width = 900,
    title={
        'text': 'NIVEL EDUCATIVO VS. INGRESOS- Según sexo',
        },
    xaxis = {'categoryorder':'total descending'},
    yaxis_title = 'Ingresos en miles de pesos argentinos',
    xaxis_title = None,
    plot_bgcolor='white',
    font=dict(
        family="Calibri",
        size=14,
        color="black")
)

Setting aside the special-education category, the widest gender pay gap occurs among those with primary-level studies, at 80%. Even so, it is striking that the gap barely narrows with higher education: at the postgraduate level, men still earn on average 65% more than women.


5.1) Age (not split into ranges) vs. income

In [160]:
plt.figure(figsize=(12,6))
sns.lineplot(data=df_1, x='Edad', y= 'Ingresos', color='#542344')
plt.title("Ingresos vs. Edad", fontsize=15, verticalalignment='bottom');
plt.xlabel("Edad");
plt.ylabel("Ingresos en pesos argentinos")
Out[160]:
Text(0, 0.5, 'Ingresos en pesos argentinos')

Recall Hypothesis 5:

The income-vs-age curve shows income rising until roughly age 55, after which it begins to decline. Hypothesis 5 therefore appears to hold only up to that age, not over the final 10 years of working life.

In [161]:
plt.figure(figsize=(10,8))
sns.lineplot(data=df_1, x='Edad', y= 'Ingresos', hue='CAT_OCUP', palette= 'rocket')
plt.title("Ingresos Trabajadores vs. Edad- Según categoría ocupacional", fontsize=15, verticalalignment='bottom');
plt.xlabel("Edad");
plt.ylabel("Ingresos en pesos argentinos")
Out[161]:
Text(0, 0.5, 'Ingresos en pesos argentinos')

There is an income gap between salaried and self-employed workers that persists throughout people's lives. We will expand on this when analyzing the age variable discretized into ranges.

In addition, the employers' income curve is more variable and looks cyclical, its lower troughs at times crossing the salaried workers' curve.

In [162]:
plt.figure(figsize=(10,8))
sns.lineplot(data=df_1, x='Edad', y= 'Ingresos', hue='Sexo', palette= ['#843B62','#201E50'])
plt.title("Ingresos Trabajadores vs. Edad- Según sexo", fontsize=15, verticalalignment='bottom');
plt.xlabel("Edad");
plt.ylabel("Ingresos en pesos argentinos")
Out[162]:
Text(0, 0.5, 'Ingresos en pesos argentinos')

There is a gender pay gap that persists throughout people's lives. We will expand on this when analyzing the age variable discretized into ranges.

5.2) Age (split into ranges) vs. income

In [163]:
plt.figure(figsize=(8,6))
sns.barplot(y='Ingresos',x='Edad_2',data=df_1, color='#542344')
plt.title("INGRESOS VS. EDAD", color='black')
plt.xlabel("Rango etáreo",color='black')
plt.ylabel("Ingresos en pesos argentinos",color='black')
Out[163]:
Text(0, 0.5, 'Ingresos en pesos argentinos')
In [164]:
df_1.Ingresos.groupby(df_1['Edad_2']).mean()
Out[164]:
Edad_2
(17.999, 24.0]    31809.354839
(24.0, 30.0]      47213.657524
(30.0, 38.0]      56590.521837
(38.0, 46.0]      60406.821122
(46.0, 55.0]      65277.970188
(55.0, 65.0]      60541.764929
Name: Ingresos, dtype: float64

Incomes peak in the 46-55 age bracket, on average doubling those of the 18-24 bracket.
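That doubling can be checked directly from the group means in the output above:

```python
# Mean incomes by age bracket, copied from the groupby output above.
mean_46_55 = 65277.970188
mean_18_24 = 31809.354839

ratio = mean_46_55 / mean_18_24
print(round(ratio, 2))  # prints 2.05, i.e. roughly double
```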

In [165]:
df_1.groupby(['Edad_2','CAT_OCUP']).agg({'Ingresos':['mean','median']}).reset_index()
Out[165]:
Edad_2 CAT_OCUP Ingresos
mean median
0 (17.999, 24.0] Cuenta Propia 23282.183908 20000.0
1 (17.999, 24.0] Obrero o empleado 33370.864286 30000.0
2 (17.999, 24.0] Patron 34846.153846 30000.0
3 (24.0, 30.0] Cuenta Propia 33440.326340 30000.0
4 (24.0, 30.0] Obrero o empleado 49203.109513 40000.0
5 (24.0, 30.0] Patron 86762.195122 50000.0
6 (30.0, 38.0] Cuenta Propia 42608.691843 30000.0
7 (30.0, 38.0] Obrero o empleado 59377.663949 50000.0
8 (30.0, 38.0] Patron 72247.191011 60000.0
9 (38.0, 46.0] Cuenta Propia 44866.800000 35000.0
10 (38.0, 46.0] Obrero o empleado 63865.078267 58000.0
11 (38.0, 46.0] Patron 83152.380952 60000.0
12 (46.0, 55.0] Cuenta Propia 45650.911854 35500.0
13 (46.0, 55.0] Obrero o empleado 69558.817708 58500.0
14 (46.0, 55.0] Patron 89887.096774 70000.0
15 (55.0, 65.0] Cuenta Propia 41439.669421 30000.0
16 (55.0, 65.0] Obrero o empleado 66453.146515 60000.0
17 (55.0, 65.0] Patron 93323.232323 70000.0
In [166]:
plt.figure(figsize=(12,7))
sns.barplot(y='Ingresos',x='Edad_2',data=df_1, hue='CAT_OCUP', palette= 'rocket')
plt.title("INGRESOS VS. EDAD- Según categoría ocupacional", color='black')
plt.xlabel("Rango etáreo",color='black')
plt.ylabel("Ingresos en pesos argentinos",color='black')
Out[166]:
Text(0, 0.5, 'Ingresos en pesos argentinos')

Bearing in mind that the retirement age in Argentina is 60 for women and 65 for men, salaried workers earn on average more than self-employed workers throughout the entire active working life. This gap is most pronounced in the 55-to-65 age range, where it reaches 60%.

Employers, meanwhile, earn the most at every age.

In [167]:
plt.figure(figsize=(8,6))
sns.barplot(y='Ingresos',x='Edad_2',data=df_1, hue='Sexo', palette= ['#843B62','#201E50'])
plt.title("INGRESOS VS. EDAD- Según sexo", color='black')
plt.xlabel("Rango etáreo",color='black')
plt.ylabel("Ingresos en pesos argentinos",color='black')
Out[167]:
Text(0, 0.5, 'Ingresos en pesos argentinos')
In [168]:
df_1.groupby(['Edad_2','Sexo']).agg({'Ingresos':['mean','median']}).reset_index()
Out[168]:
Edad_2 Sexo Ingresos
mean median
0 (17.999, 24.0] F 27738.049536 22000.0
1 (17.999, 24.0] M 34367.782101 30000.0
2 (24.0, 30.0] F 42710.140734 35000.0
3 (24.0, 30.0] M 50692.403781 43200.0
4 (30.0, 38.0] F 47978.175810 40000.0
5 (30.0, 38.0] M 63609.933943 52000.0
6 (38.0, 46.0] F 51525.339339 45000.0
7 (38.0, 46.0] M 67982.470287 60000.0
8 (46.0, 55.0] F 54901.687957 45000.0
9 (46.0, 55.0] M 75129.866709 60000.0
10 (55.0, 65.0] F 49429.791480 40000.0
11 (55.0, 65.0] M 68679.597701 60000.0

As for gender, men earn more than women throughout the entire active working life. The widest gap, 39%, occurs in the 55-to-65 age range, i.e., as workers approach retirement age.
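As a sanity check, a gap figure like this can be recomputed directly from a groupby. A minimal sketch on a hypothetical mini-sample (the notebook itself would run this on df_1 with its real Edad_2, Sexo and Ingresos columns; the incomes below are made up):

```python
import pandas as pd

# Hypothetical stand-in for df_1, restricted to one age bracket
df = pd.DataFrame({
    'Edad_2':   ['(55.0, 65.0]'] * 4,
    'Sexo':     ['F', 'M', 'F', 'M'],
    'Ingresos': [49000, 68000, 50000, 69000],
})

# Mean income per sex, then the gap as a percentage of the women's mean
means = df.groupby('Sexo')['Ingresos'].mean()
gap_pct = (means['M'] - means['F']) / means['F'] * 100
print(round(gap_pct, 1))  # → 38.4
```

On the real data, the same expression evaluated within each Edad_2 bracket yields the gaps discussed above.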

6) Weekly hours worked vs. Income¶

In [169]:
plt.figure(figsize=(12,6))
sns.scatterplot(data=df_1, x='Horas_sem', y= 'Ingresos', color= '#B2BCD1')
sns.lineplot(data=df_1, x='Horas_sem', y= 'Ingresos', color= '#542344')
plt.title("Ingresos vs Horas semanales trabajadas", fontsize=15, verticalalignment='bottom');
plt.xlabel("Horas semanales");
plt.ylabel("Ingresos en millones de pesos argentinos")
Out[169]:
Text(0, 0.5, 'Ingresos en millones de pesos argentinos')
In [170]:
#Same chart as the previous one, in Plotly this time, so we can zoom in and inspect the slope of the trend line.
fig = px.scatter(df_1,
                 x="Horas_sem", 
                 y="Ingresos",
                 trendline="lowess",
                 trendline_color_override= 'black',
                 title= 'Ingresos según horas semanales trabajadas'
                 )
fig.update_layout(
    height = 500,
    width = 900,
    title={
        'xanchor': 'center',
        'yanchor': 'top',
        'y':0.97,
        'x':0.5,
        },
    yaxis_title = 'Ingresos en millones de pesos argentinos'
    )
fig.update_traces(marker=dict(size=3,
                              color='#8CACAC'))
                  
fig.show()
In [171]:
fig = px.scatter(df_1, 
                 x="Horas_sem", 
                 y="Ingresos", 
                 facet_col="Sexo",
                 color='Sexo',  
                 trendline="lowess",
                 trendline_color_override= 'black',  
                 title= 'Ingresos según horas semanales trabajadas según sexo'
                 )
fig.update_layout(
    height = 400,
    width = 1150,
    title={
        'xanchor': 'center',
        'yanchor': 'top',
        'y':0.97,
        'x':0.5,
        },
    yaxis_title = 'Ingresos en millones de pesos argentinos'
    )
fig.update_traces(marker=dict(size=4,
                              ))
fig.show()
In [172]:
fig = px.scatter(df_1, 
                 x="Horas_sem", 
                 y="Ingresos", 
                 facet_col="CAT_OCUP",
                 color='CAT_OCUP',  
                 trendline="lowess",
                 trendline_color_override= 'black',  
                 title= 'Ingresos según horas semanales trabajadas según categoría ocupacional'
                 )
fig.update_layout(
    height = 400,
    width = 1150,
    title={
        'xanchor': 'center',
        'yanchor': 'top',
        'y':0.97,
        'x':0.5,
        },
    yaxis_title = 'Ingresos en millones de pesos argentinos'
    )
fig.update_traces(marker=dict(size=4,
                              color='#7C98B3'))
fig.show()

Let us recall Hypothesis 6 as stated:

Zooming into the charts, the income vs. weekly-hours-worked curves show the following features:

  • 0-19 h/week segment: for women, men, salaried workers and employers, the curve has a slightly negative slope. For the self-employed, the slope is positive.
  • 19-40 h/week segment: the slope is positive for women, men, salaried workers and the self-employed. The employers' case is less clear-cut.
  • 40+ h segment: the slope is negative for women and the self-employed, roughly flat for men and salaried workers, and erratic for employers.

In conclusion, it is not yet clear how this variable will contribute to predicting income. This will be settled during the modeling stage.
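One way to make the segment-by-segment reading less visual and more quantitative is to fit a straight line per hours segment and inspect the sign of its slope. A sketch on synthetic data (the segment cut-points and column names follow the notebook; the data here is fabricated for illustration, flat outside 19-40 h and rising in between):

```python
import numpy as np
import pandas as pd

# Synthetic income-vs-hours cloud mimicking the middle-segment rise
rng = np.random.default_rng(0)
hours = rng.uniform(1, 70, 300)
income = 20000 + 600 * np.clip(hours, 19, 40) + rng.normal(0, 500, 300)
df = pd.DataFrame({'Horas_sem': hours, 'Ingresos': income})

def segment_slope(frame, lo, hi):
    """Slope of a least-squares line fitted within [lo, hi) hours."""
    seg = frame[(frame['Horas_sem'] >= lo) & (frame['Horas_sem'] < hi)]
    slope, _ = np.polyfit(seg['Horas_sem'], seg['Ingresos'], 1)
    return slope

for lo, hi in [(0, 19), (19, 40), (40, 71)]:
    print(f'{lo}-{hi} h: slope {segment_slope(df, lo, hi):.0f}')
```

Applied to df_1, a near-zero or negative slope in a segment would support the mixed picture described above.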

7) Occupation intensity vs. Income¶

In [173]:
fig = make_subplots(rows=1, cols=2,
                    subplot_titles=('Histograma', "Boxplot"))

fig.add_trace(
    go.Histogram( x=df_1['INTENSI'], 
                  y=df_1['Ingresos'],
                  histfunc='avg' 
     ),
     row=1, col=1

)

fig.add_trace(
    go.Box( x=df_1['INTENSI'], 
            y=df_1['Ingresos'],
            boxmean='sd', # show the mean and the standard deviation
            boxpoints='suspectedoutliers', # only suspected outliers
            marker=dict(
            color='rgba(219, 64, 82, 0.6)',
            outliercolor='rgb(8,81,156)',
            line=dict(
            outliercolor='rgb(8,81,156)',
            outlierwidth=2)),
            line_color='rgb(8,81,156)'   
      ),
    row=1, col=2
)


fig.update_layout(height=600, width=1200, title_text="INTENSIDAD DE LA OCUPACIÓN VS. INGRESOS",
                  yaxis_title = 'Ingresos en miles de pesos argentinos',
                  xaxis_title = None, showlegend=False,
                  xaxis = {'categoryorder':'total descending'})
fig.show()

Number of occupations:

In [174]:
fig = make_subplots(rows=1, cols=2,
                    subplot_titles=('Histograma', "Boxplot"))

fig.add_trace(
    go.Histogram( x=df_1['Cant_Ocup'], 
                  y=df_1['Ingresos'],
                  histfunc='avg' 
     ),
     row=1, col=1

)

fig.add_trace(
    go.Box( x=df_1['Cant_Ocup'], 
            y=df_1['Ingresos'],
            boxmean='sd', # show the mean and the standard deviation
            boxpoints='suspectedoutliers', # only suspected outliers
            marker=dict(
            color='rgba(219, 64, 82, 0.6)',
            outliercolor='rgb(8,81,156)',
            line=dict(
            outliercolor='rgb(8,81,156)',
            outlierwidth=2)),
            line_color='rgb(8,81,156)'   
      ),
    row=1, col=2
)


fig.update_layout(height=600, width=1200, title_text="CANTIDAD DE OCUPACIONES VS. INGRESOS",
                  yaxis_title = 'Ingresos en miles de pesos argentinos',
                  xaxis_title = None, showlegend=False,
                  xaxis = {'categoryorder':'total descending'})
fig.show()

Let us recall Hypothesis 7 as stated:

This hypothesis appears to hold, as one would expect, judging by the income difference between the underemployed and the fully employed: the latter earn almost 2.4 times more. The gap narrows, however, between the overemployed and the fully employed: the difference is 15% in favor of the overemployed.

Looking at the number-of-occupations variable, the more occupations a person holds, the lower their average income. Moreover, income values are more dispersed among people with a single occupation.
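Both the mean comparison and the dispersion remark reduce to a single groupby aggregation. A sketch with made-up numbers (the notebook would aggregate df_1['Ingresos'] by df_1['Cant_Ocup']):

```python
import pandas as pd

# Made-up incomes: single-occupation workers span a wider range
df = pd.DataFrame({
    'Cant_Ocup': [1, 1, 1, 1, 2, 2, 2, 2],
    'Ingresos':  [20000, 45000, 90000, 250000, 30000, 35000, 40000, 45000],
})

# Mean and standard deviation per number of occupations
stats = df.groupby('Cant_Ocup')['Ingresos'].agg(['mean', 'std'])
print(stats)
```

In this toy sample, as in the real data, the single-occupation group shows the larger spread (higher std).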

8) Occupation/profession vs. Income¶

In [175]:
fig= px.histogram(df_1, 
                  x='CALIFICACION_OCUP', 
                  y='Ingresos',  
                  hover_name='Ingresos', 
                  histfunc='avg', text_auto='.2s', 
                  color_discrete_sequence=['#3A182E']
      )
fig.update_traces(
    textposition = 'outside'
) #modify everything related to the traces
fig.update_layout(
    height = 500,
    width = 600,
    title={
        'text': 'CALIFICACIÓN OCUPACIONAL VS. INGRESOS',
        },
    xaxis = {'categoryorder':'total descending'},
    yaxis_title = 'Ingresos en miles de pesos argentinos',
    xaxis_title = None,
    plot_bgcolor='white',
    font=dict(
        family="Calibri",
        size=14,
        color="black")
)
In [176]:
fig= px.histogram(df_1, 
                  x='TECNOLOGIA_OCUP', 
                  y='Ingresos',  
                  hover_name='Ingresos', 
                  histfunc='avg', text_auto='.2s', 
                  color_discrete_sequence=['#1e0454']
      )
fig.update_traces(
    textposition = 'outside'
) #modify everything related to the traces
fig.update_layout(
    height = 500,
    width = 600,
    title={
        'text': 'TECNOLOGÍA OCUPACIONAL VS. INGRESOS',
        },
    xaxis = {'categoryorder':'total descending'},
    yaxis_title = 'Ingresos en miles de pesos argentinos',
    xaxis_title = None,
    plot_bgcolor='white',
    font=dict(
        family="Calibri",
        size=14,
        color="black")
)

Let us recall Hypothesis 8 as stated:

Setting aside the Ns/Nr (don't know/no answer) categories, we can take Hypothesis 8 to be true: the more qualified the person and the more complex the technology they use, the higher their income.

For example, professionals earn on average 62% more than technicians, and those who work with computer systems earn 76% more than those who use no machinery at all.

To deepen the analysis, we will identify the occupations with the highest income by plotting the occupational character variable (CARACTER_OCUP) vs. income. We will also analyze occupational hierarchy (JERARQUIA_OCUP) vs. income.

In [177]:
fig= px.histogram(df_1, 
                  y='CARACTER_OCUP', 
                  x='Ingresos',  
                  hover_name='Ingresos', 
                  histfunc='avg', text_auto='.2s'
                  
      )
fig.update_traces(
    textposition = 'outside'
) #modify everything related to the traces
fig.update_layout(
    height = 1000,
    width = 1000,
    title={
        'text': 'CARACTER OCUPACIONAL VS. INGRESOS',
        },
    yaxis = {'categoryorder':'total ascending'},
    xaxis_title = 'Ingresos en miles de pesos argentinos',
    yaxis_title = 'Código de carácter ocupacional',
    plot_bgcolor='white',
    font=dict(
        family="Calibri",
        size=14,
        color="black")
)

The three categories with the highest average income are:

  1. "02- Funcionarios del poder judicial, federal, nacional, provincial, municipal y/o departamental". With an average income of almost 300 thousand pesos, it exceeds the second-ranked category by 59%.

  2. "64- Ocupaciones de la producción pesquera". With an average income of almost $190 thousand, it exceeds the third-ranked category by only 8%.

  3. "07- Directivos de grandes empresas privadas productoras de bienes y servicios". Average income of 173 thousand pesos.

The three categories with the lowest income are:

  • "58- Ocupaciones de los servicios sociales varios". Average income: 27 thousand pesos.
  • "33- Ocupaciones de la comercialización ambulante y callejera". Average income: 22 thousand pesos.
  • "55- Ocupaciones de los servicios domésticos". Average income: 19 thousand pesos.

Comparing the extremes, a judicial-branch official earns on average 16 times more than a domestic worker.

In [178]:
fig = make_subplots(rows=1, cols=2,
                    subplot_titles=('Histograma', "Boxplot"))

fig.add_trace(
    go.Histogram( x=df_1['JERARQUIA_OCUP'], 
                  y=df_1['Ingresos'],
                  histfunc='avg' 
     ),
     row=1, col=1

)

fig.add_trace(
    go.Box( x=df_1['JERARQUIA_OCUP'], 
            y=df_1['Ingresos'],
            boxmean='sd', # show the mean and the standard deviation
            boxpoints='suspectedoutliers', # only suspected outliers
            marker=dict(
            color='rgba(219, 64, 82, 0.6)',
            outliercolor='rgb(8,81,156)',
            line=dict(
            outliercolor='rgb(8,81,156)',
            outlierwidth=2)),
            line_color='rgb(8,81,156)'   
      ),
    row=1, col=2
)


fig.update_layout(height=600, width=1200, title_text="JERARQUÍA OCUPACIONAL VS. INGRESOS",
                  yaxis_title = 'Ingresos en miles de pesos argentinos',
                  xaxis_title = None, showlegend=False,
                  xaxis = {'categoryorder':'total descending'})
fig.show()

This variable is very similar to occupational category (CAT_OCUP): the mean income values closely match for the wage-earner (Asalariados) and self-employed (Cuenta propia) categories.

However, comparing the employers (Patron) category against Jefes and Direccion, employers' average income is lower: around 83 thousand pesos, versus close to 100 thousand for supervisors and management.

9) Company/business characteristics vs. Income¶

In [179]:
fig = make_subplots(rows=1, cols=2,
                    subplot_titles=('Histograma', "Boxplot"))

fig.add_trace(
    go.Histogram( x=df_1['Tipo_empr'], 
                  y=df_1['Ingresos'],
                  histfunc='avg' 
     ),
     row=1, col=1

)

fig.add_trace(
    go.Box( x=df_1['Tipo_empr'], 
            y=df_1['Ingresos'],
            boxmean='sd', # show the mean and the standard deviation
            boxpoints='suspectedoutliers', # only suspected outliers
            marker=dict(
            color='rgba(219, 64, 82, 0.6)',
            outliercolor='rgb(8,81,156)',
            line=dict(
            outliercolor='rgb(8,81,156)',
            outlierwidth=2)),
            line_color='rgb(8,81,156)'   
      ),
    row=1, col=2
)


fig.update_layout(height=600, width=1200, title_text="ÁMBITO DE LA EMPRESA VS. INGRESOS",
                  yaxis_title = 'Ingresos en miles de pesos argentinos',
                  xaxis_title = None, showlegend=False,
                  xaxis = {'categoryorder':'total descending'})
fig.show()

Those who work in the public sector earn 31% more than those in the private sector. As for the 'Otro' (other) category, we can assume it covers other organizations such as NGOs. Note that, as the univariate analysis showed, this group represents only 1.51% of all workers in the dataset.

In [180]:
fig = px.violin(df_1, 
                 x="Tamaño_empr_2",          
                 y= 'Ingresos', 
                 title= 'TAMAÑO DE LA EMPRESA VS. INGRESOS'
                 )
fig.update_layout(
    height = 500,
    width = 700,
    title={
        'xanchor': 'center',
        'yanchor': 'top',
        'y':0.97,
        'x':0.5,
        },
        yaxis_title = 'Ingresos en miles de pesos argentinos',
        xaxis_title = None,
        xaxis = {'categoryorder':'total descending'}
    )
fig.show()

The larger the company, the higher its workers' income, with an average pay gap of 218% between large companies and microenterprises.

In [181]:
fig= px.histogram(df_1, 
                  y='Lugar_trab', 
                  x='Ingresos',  
                  hover_name='Ingresos', 
                  histfunc='avg', text_auto='.2s'
                  
      )
fig.update_traces(
    textposition = 'outside'
) #modify everything related to the traces
fig.update_layout(
    height = 500,
    width = 700,
    title={
        'text': 'LUGAR FÍSICO DE TRABAJO VS. INGRESOS',
        },
    yaxis = {'categoryorder':'total ascending'},
    xaxis_title = 'Ingresos en miles de pesos argentinos',
    yaxis_title = 'Código de lugar de trabajo',
    plot_bgcolor='white',
    font=dict(
        family="Calibri",
        size=14,
        color="black")
)

The three workplaces with the highest average income are:

  1. "4- En vehículo para transporte de personas y mercaderías- aéreos, marítimo, terrestre". Average income: 68 thousand pesos.
  2. "5- En obras en construcción, de infraestructura, minería o similares". Average income: 64 thousand pesos (6% below the first).
  3. "1- En un local/oficina/establecimiento/negocio/taller/chacra/finca". Average income: 63 thousand pesos (9% below the first category).

Meanwhile, those who report working in "7-Vivienda del socio o del patrón" (partner's or employer's home) have the lowest income: someone in the top-ranked category earns, on average, 2.4 times more.

In [182]:
fig= px.histogram(df_1, 
                  y='ACTIV_ECON', 
                  x='Ingresos', 
                  color='CAT_ECON', 
                  hover_name='Ingresos', 
                  histfunc='avg', text_auto='.2s',
                  color_discrete_sequence= px.colors.sequential.Plasma_r
      )
fig.update_traces(
    textposition = 'outside'
) #modify everything related to the traces
fig.update_layout(
    height = 1300,
    width = 900,
    title={
        'text': 'ACTIVIDAD ECONÓMICA DE LA EMPRESA VS. INGRESOS (Color según Categoría Económica)',
        },
    yaxis = {'categoryorder':'total ascending'},
    xaxis_title = 'Ingresos en miles de pesos argentinos',
    yaxis_title = 'Código de actividad económica',
    plot_bgcolor='white',
    font=dict(
        family="Calibri",
        size=14,
        color="black")
)

The three activities with the highest average income (close to 130 thousand pesos) are:

  1. "12- Elaboración de Productos de Tabaco", in category "C- Industria Manufacturera".
  2. "51- Transporte Aéreo", in category "H- Transporte y Almacenamiento".
  3. "60- Actividades de Programación y Difusión de Radio y Televisión", in category "J- Información y Comunicación".

The three activities with the lowest average income (below 30 thousand pesos) are:

  • "81- Servicios de Apoyo a Edificios y Actividades de Limpieza en General; Servicios de Paisajismo y Jardinería", in category "N- Actividades Administrativas y Servicios de Apoyo".
  • "88- Servicios Sociales sin Alojamiento", in category "Q- Salud Humana y Servicios Sociales".
  • "97- Actividades de los Hogares como Empleadores de Personal Doméstico", in category "T- Actividades de los Hogares como Empleadores de Personal Doméstico; Actividades de los Hogares como Productores de Bienes o Servicios para Uso Propio".

Next we will plot the economic categories (recall that they group the individual economic activities).

In [183]:
fig= px.histogram(df_1, 
                  y='CAT_ECON', 
                  x='Ingresos',  
                  hover_name='Ingresos', 
                  histfunc='avg', text_auto='.2s',
                  
      )
fig.update_traces(
    textposition = 'outside'
) #modify everything related to the traces
fig.update_layout(
    height = 800,
    width = 800,
    title={
        'text': 'CATEGORÍA ECONÓMICA DE LA EMPRESA VS. INGRESOS',
        },
    yaxis = {'categoryorder':'total ascending'},
    xaxis_title = 'Ingresos en miles de pesos argentinos',
    yaxis_title = 'Código de categoría económica',
    plot_bgcolor='white',
    font=dict(
        family="Calibri",
        size=14,
        color="black")
)

When grouping activities by category, "K- Actividades Financieras y de Seguros" ranks first in workers' average income.

Second, with an average income just 2% below the first, comes "J- Información y Comunicación".

Third is category D, whose activities relate to the supply of electricity, gas, steam and air conditioning, with an average income 8% below category K.

In last place, consistent with the previous chart, is category T, which covers domestic-employment activities.

Let us recall Hypothesis 9 as stated:

This hypothesis appears to hold. Broadly, higher incomes go to those who work in the public sector, in large companies, and to those whose physical workplace is a vehicle (land/air/sea).

The behavior of income with respect to the company's economic activity is less clear, since no activity or economic category stands out much above the rest.

What is clear is that the domestic-employment sector ranks last in every income chart.

EDA insights¶


Machine Learning¶

Feature engineering¶

This stage consists of preparing the variables before feeding them into our Machine Learning model. We will review each variable's type and which transformation it needs.

In [184]:
#Create a copy of the dataset produced during Data Wrangling
df_1b=df_1a.copy()
In [185]:
df_1b.info()
<class 'pandas.core.frame.DataFrame'>
Int64Index: 46241 entries, 0 to 49705
Data columns (total 25 columns):
 #   Column             Non-Null Count  Dtype  
---  ------             --------------  -----  
 0   REGION             46241 non-null  object 
 1   AGLOMERADO         46241 non-null  int64  
 2   MAS_500            46241 non-null  object 
 3   Sexo               46241 non-null  object 
 4   Edad               46241 non-null  int64  
 5   NIVEL_ED_2         46241 non-null  object 
 6   NIVEL_ED           46241 non-null  object 
 7   ESTADO             46241 non-null  int64  
 8   CAT_OCUP           46241 non-null  object 
 9   Cant_Ocup          46241 non-null  int64  
 10  Horas_sem          46241 non-null  float64
 11  INTENSI            46241 non-null  object 
 12  Tipo_empr          46241 non-null  object 
 13  Cod_activ          46241 non-null  int64  
 14  Tamaño_empr        46241 non-null  object 
 15  Cod_Ocup           46241 non-null  int64  
 16  Lugar_trab         46241 non-null  int64  
 17  Ingresos           46241 non-null  int64  
 18  Tamaño_empr_2      46241 non-null  object 
 19  CARACTER_OCUP      46241 non-null  object 
 20  JERARQUIA_OCUP     46241 non-null  object 
 21  TECNOLOGIA_OCUP    46241 non-null  object 
 22  CALIFICACION_OCUP  46241 non-null  object 
 23  ACTIV_ECON         46241 non-null  object 
 24  CAT_ECON           46241 non-null  object 
dtypes: float64(1), int64(8), object(16)
memory usage: 9.2+ MB

Classifying our variables once more, according to how they will enter the model:

Numeric variables (enter the model as-is):

  • Edad
  • Horas_sem
  • Cant_Ocup
  • ESTADO
  • AGLOMERADO
  • Lugar_trab
  • Ingresos

Ordinal categorical variables (Label Encoding, LE, will be applied):

  • Tipo_empr
  • CAT_OCUP
  • NIVEL_ED
  • NIVEL_ED_2
  • INTENSI
  • Tamaño_empr
  • Tamaño_empr_2
  • JERARQUIA_OCUP
  • TECNOLOGIA_OCUP
  • CALIFICACION_OCUP

Non-ordinal categorical variables (One-Hot Encoding, OHE, will be applied):

  • REGION
  • MAS_500
  • Sexo
  • CAT_ECON

Non-ordinal categorical variables (OHE will not be applied):

Given the large number of categories in each of these variables, we will not apply OHE.

  • Cod_activ
  • Cod_Ocup
  • ACTIV_ECON (will be converted to integer type)
  • CARACTER_OCUP (will be converted to integer type)
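The plan above can be sketched end-to-end on a toy frame. The column names match the notebook, but the rows and the `size_order` mapping shown here are illustrative; the actual mappings are applied cell by cell below:

```python
import pandas as pd

# One column of each kind, with hypothetical rows
df = pd.DataFrame({
    'Tamaño_empr_2': ['Microempresa', 'Grande', 'Pequeña'],  # ordinal
    'Sexo':          ['F', 'M', 'F'],                        # nominal, few levels
    'CARACTER_OCUP': ['02', '64', '07'],                     # nominal, many levels
    'Ingresos':      [30000, 90000, 45000],                  # target
})

# Ordinal -> label encoding with an explicit, order-preserving mapping
size_order = {'Microempresa': 1, 'Pequeña': 2, 'Mediana': 3, 'Grande': 4}
df['Tamaño_empr_2'] = df['Tamaño_empr_2'].map(size_order)

# Low-cardinality nominal -> one-hot encoding
df = pd.get_dummies(df, columns=['Sexo'])

# High-cardinality nominal -> plain integer codes instead of OHE
df['CARACTER_OCUP'] = df['CARACTER_OCUP'].astype(int)

print(df.columns.tolist())
```

The explicit dict for the ordinal column matters: it preserves the category order (micro < small < medium < large), which an automatic encoder would not guarantee.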

LabelEncoding¶

Company type¶

In [186]:
df_1b['Tipo_empr'].value_counts()
Out[186]:
0.0        28485
Privada    13116
Estatal     4368
Otro         272
Name: Tipo_empr, dtype: int64
In [187]:
#Apply LE to the column
df_1b['Tipo_empr']=df_1b['Tipo_empr'].replace({'Otro':1,'Privada':2,'Estatal':3})
In [188]:
#Check that the values were replaced correctly
df_1b['Tipo_empr'].value_counts()
Out[188]:
0.0    28485
2.0    13116
3.0     4368
1.0      272
Name: Tipo_empr, dtype: int64

Occupational category¶

In [189]:
df_1b['CAT_OCUP'].value_counts()
Out[189]:
0                           27454
Obrero o empleado           14162
Cuenta Propia                3966
Patron                        547
Trabajador fliar s/remun      111
Ns/Nr                           1
Name: CAT_OCUP, dtype: int64
In [190]:
#Apply LE to the column
df_1b['CAT_OCUP']=df_1b['CAT_OCUP'].replace({'Ns/Nr':0, 'Trabajador fliar s/remun':0, 'Cuenta Propia':1, 'Obrero o empleado':2,'Patron':3})
In [191]:
#Check that the values were replaced correctly
df_1b['CAT_OCUP'].value_counts()
Out[191]:
0    27566
2    14162
1     3966
3      547
Name: CAT_OCUP, dtype: int64

Educational level 1¶

In [192]:
df_1b['NIVEL_ED'].value_counts()
Out[192]:
3-Secundario incompl            9808
4-Secundario compl              9166
1-Primario incompl              6880
6-Superior universit compl      5766
2-Primario compl                5542
5-Superior universit incompl    5450
0-Sin instrucción               3629
Name: NIVEL_ED, dtype: int64
In [193]:
#Apply LE to the NIVEL_ED column
df_1b['NIVEL_ED']=df_1b['NIVEL_ED'].replace({'0-Ns/Nr':0,'1-Primario incompl':1,'2-Primario compl':2,
                                           '3-Secundario incompl':3,'4-Secundario compl':4,'5-Superior universit incompl':5,
                                           '6-Superior universit compl':6,'0-Sin instrucción':0})
In [194]:
#Check that the values were replaced correctly
df_1b['NIVEL_ED'].value_counts()
Out[194]:
3    9808
4    9166
1    6880
6    5766
2    5542
5    5450
0    3629
Name: NIVEL_ED, dtype: int64

Educational level 2¶

In [195]:
df_1b['NIVEL_ED_2'].value_counts()
Out[195]:
4-Secundario                      18022
2-Primario                        11659
7-Universitario                    6560
6-Terciario                        4383
1-Jardín/preescolar/Sin instr.     3629
3-EGB                               855
5-Polimodal                         586
0-Educación especial                274
8-Posgrado universitario            269
99                                    4
Name: NIVEL_ED_2, dtype: int64
In [196]:
#Apply LE to the NIVEL_ED_2 column (the stray code 99 is folded into 0)
df_1b['NIVEL_ED_2']=df_1b['NIVEL_ED_2'].replace({'0-Educación especial':0,'1-Jardín/preescolar/Sin instr.':1,'2-Primario':2,
                                           '3-EGB':3,'4-Secundario':4,'5-Polimodal':5,
                                           '6-Terciario':6,'7-Universitario':7,'8-Posgrado universitario':8, 99:0})
In [197]:
#Check that the values were replaced correctly
df_1b['NIVEL_ED_2'].value_counts()
Out[197]:
4    18022
2    11659
7     6560
6     4383
1     3629
3      855
5      586
0      278
8      269
Name: NIVEL_ED_2, dtype: int64

Occupation intensity¶

In [198]:
df_1b['INTENSI'].value_counts()
Out[198]:
0.0                            28485
2-Ocupado pleno                 9375
3-Sobreocupado                  4840
1-Subocupado                    1882
2-Ocup q no trabajó ult sem     1659
Name: INTENSI, dtype: int64
In [199]:
#Apply LE to the INTENSI column
df_1b['INTENSI']=df_1b['INTENSI'].replace({'1-Subocupado':1,'2-Ocupado pleno':2,'3-Sobreocupado':3,'2-Ocup q no trabajó ult sem':2})
In [200]:
#Convert the column to integer type
df_1b['INTENSI']= df_1b['INTENSI'].astype(int)
In [201]:
#Check that the values were replaced correctly
df_1b['INTENSI'].value_counts()
Out[201]:
0    28485
2    11034
3     4840
1     1882
Name: INTENSI, dtype: int64

Company size 1¶

In [202]:
df_1b['Tamaño_empr'].value_counts()
Out[202]:
0) Ns/Nr               31709
1) 1 pers               2944
2) 2 pers               1591
9) 41 a 100 pers        1417
6) 6 a 10 pers          1379
7) 11 a 25 pers         1277
8) 26 a 40 pers         1254
3) 3 pers                967
10) 101 a 200 pers       958
12) Más de 500 pers      911
11) 201 a 500 pers       699
4) 4 pers                591
5) 5 pers                544
Name: Tamaño_empr, dtype: int64
In [203]:
#Apply LE to the column
df_1b['Tamaño_empr']=df_1b['Tamaño_empr'].replace({'0) Ns/Nr':0,'1) 1 pers':1,'2) 2 pers':2,'3) 3 pers':3,'4) 4 pers':4,'5) 5 pers':5,
                                         '6) 6 a 10 pers':6, '7) 11 a 25 pers':7, '8) 26 a 40 pers':8, '9) 41 a 100 pers':9,
                                         '10) 101 a 200 pers':10, '11) 201 a 500 pers':11, '12) Más de 500 pers':12})
In [204]:
#Check that the values were replaced correctly
df_1b['Tamaño_empr'].value_counts()
Out[204]:
0     31709
1      2944
2      1591
9      1417
6      1379
7      1277
8      1254
3       967
10      958
12      911
11      699
4       591
5       544
Name: Tamaño_empr, dtype: int64

Company size 2¶

In [205]:
df_1b['Tamaño_empr_2'].value_counts()
Out[205]:
0) Ns/Nr        31709
Microempresa     6637
Pequeña          3910
Mediana          2375
Grande           1610
Name: Tamaño_empr_2, dtype: int64
In [206]:
#Apply LE to the column
df_1b['Tamaño_empr_2']=df_1b['Tamaño_empr_2'].replace({'0) Ns/Nr':0,'Microempresa':1, 'Pequeña':2, 'Mediana':3, 'Grande':4})
In [207]:
#Check that the values were replaced correctly
df_1b['Tamaño_empr_2'].value_counts()
Out[207]:
0    31709
1     6637
2     3910
3     2375
4     1610
Name: Tamaño_empr_2, dtype: int64

Occupational hierarchy¶

In [208]:
df_1b['JERARQUIA_OCUP'].value_counts()
Out[208]:
0                28487
Asalariados      12831
Cuenta propia     3723
Direccion          736
Jefes              383
Ns/Nr               81
Name: JERARQUIA_OCUP, dtype: int64
In [209]:
#Apply LE to the column
df_1b['JERARQUIA_OCUP']=df_1b['JERARQUIA_OCUP'].replace({'Direccion':4,'Cuenta propia':1,'Jefes':3,'Asalariados':2,'Ns/Nr':0})
In [210]:
#Check that the values were replaced correctly
df_1b['JERARQUIA_OCUP'].value_counts()
Out[210]:
0    28568
2    12831
1     3723
4      736
3      383
Name: JERARQUIA_OCUP, dtype: int64

Occupational technology¶

In [211]:
df_1b['TECNOLOGIA_OCUP'].value_counts()
Out[211]:
0                           28487
Sin op. máq.                10400
Op. sist. informatizados     4612
Con op. máq.                 1542
Ns/Nr                        1200
Name: TECNOLOGIA_OCUP, dtype: int64
In [212]:
#Apply LE to the column
df_1b['TECNOLOGIA_OCUP']=df_1b['TECNOLOGIA_OCUP'].replace({'Ns/Nr':0,'Sin op. máq.':1,'Con op. máq.':2,'Op. sist. informatizados':3})
In [213]:
#Check that the values were replaced correctly
df_1b['TECNOLOGIA_OCUP'].value_counts()
Out[213]:
0    29687
1    10400
3     4612
2     1542
Name: TECNOLOGIA_OCUP, dtype: int64

Occupational qualification¶

In [214]:
df_1b['CALIFICACION_OCUP'].value_counts()
Out[214]:
0                28485
Operativo         9299
No calificado     3807
Técnicos          3175
Profesionales     1390
Ns/Nr               85
Name: CALIFICACION_OCUP, dtype: int64
In [215]:
#Apply LE to the column
df_1b['CALIFICACION_OCUP']=df_1b['CALIFICACION_OCUP'].replace({'Profesionales':4,'Técnicos':3,'Operativo':2,'No calificado':1, 'Ns/Nr':0, '0':0})
In [216]:
#Verificamos que los valores se hayan reemplazado correctamente
df_1b['CALIFICACION_OCUP'].value_counts()
Out[216]:
0    28570
2     9299
1     3807
3     3175
4     1390
Name: CALIFICACION_OCUP, dtype: int64
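El patrón de `replace` que repetimos en estas columnas puede factorizarse en una pequeña función auxiliar. Boceto mínimo (el nombre `encodear_ordinal` y el manejo de valores no mapeados son supuestos nuestros, no parte del notebook original):

```python
import pandas as pd

def encodear_ordinal(serie, mapeo, valor_desconocido=0):
    """Reemplaza etiquetas por códigos ordinales; lo no mapeado va a `valor_desconocido`."""
    return serie.map(mapeo).fillna(valor_desconocido).astype(int)

# Ejemplo con la misma lógica usada para CALIFICACION_OCUP
mapeo = {'Profesionales': 4, 'Técnicos': 3, 'Operativo': 2, 'No calificado': 1, 'Ns/Nr': 0}
s = pd.Series(['Operativo', 'Ns/Nr', 'Profesionales', 0])
print(encodear_ordinal(s, mapeo).tolist())  # [2, 0, 4, 0]
```

A diferencia de `replace`, `map` deja en NaN todo valor fuera del mapeo, lo que hace explícito el tratamiento de los casos no previstos.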

Matriz de correlación 2¶

Antes de aplicar OHE (que incrementa considerablemente el número de columnas del dataset), queremos averiguar si aumentó la correlación entre las variables numéricas y la variable objetivo, 'Ingresos', con respecto a la Matriz de correlación 1.

In [217]:
df_1b.info()
<class 'pandas.core.frame.DataFrame'>
Int64Index: 46241 entries, 0 to 49705
Data columns (total 25 columns):
 #   Column             Non-Null Count  Dtype  
---  ------             --------------  -----  
 0   REGION             46241 non-null  object 
 1   AGLOMERADO         46241 non-null  int64  
 2   MAS_500            46241 non-null  object 
 3   Sexo               46241 non-null  object 
 4   Edad               46241 non-null  int64  
 5   NIVEL_ED_2         46241 non-null  int64  
 6   NIVEL_ED           46241 non-null  int64  
 7   ESTADO             46241 non-null  int64  
 8   CAT_OCUP           46241 non-null  int64  
 9   Cant_Ocup          46241 non-null  int64  
 10  Horas_sem          46241 non-null  float64
 11  INTENSI            46241 non-null  int64  
 12  Tipo_empr          46241 non-null  float64
 13  Cod_activ          46241 non-null  int64  
 14  Tamaño_empr        46241 non-null  int64  
 15  Cod_Ocup           46241 non-null  int64  
 16  Lugar_trab         46241 non-null  int64  
 17  Ingresos           46241 non-null  int64  
 18  Tamaño_empr_2      46241 non-null  int64  
 19  CARACTER_OCUP      46241 non-null  object 
 20  JERARQUIA_OCUP     46241 non-null  int64  
 21  TECNOLOGIA_OCUP    46241 non-null  int64  
 22  CALIFICACION_OCUP  46241 non-null  int64  
 23  ACTIV_ECON         46241 non-null  object 
 24  CAT_ECON           46241 non-null  object 
dtypes: float64(2), int64(17), object(6)
memory usage: 9.2+ MB
In [218]:
df_1c=df_1b.copy()
In [219]:
df_1c.columns
Out[219]:
Index(['REGION', 'AGLOMERADO', 'MAS_500', 'Sexo', 'Edad', 'NIVEL_ED_2',
       'NIVEL_ED', 'ESTADO', 'CAT_OCUP', 'Cant_Ocup', 'Horas_sem', 'INTENSI',
       'Tipo_empr', 'Cod_activ', 'Tamaño_empr', 'Cod_Ocup', 'Lugar_trab',
       'Ingresos', 'Tamaño_empr_2', 'CARACTER_OCUP', 'JERARQUIA_OCUP',
       'TECNOLOGIA_OCUP', 'CALIFICACION_OCUP', 'ACTIV_ECON', 'CAT_ECON'],
      dtype='object')
In [220]:
df_1c.info()
<class 'pandas.core.frame.DataFrame'>
Int64Index: 46241 entries, 0 to 49705
Data columns (total 25 columns):
 #   Column             Non-Null Count  Dtype  
---  ------             --------------  -----  
 0   REGION             46241 non-null  object 
 1   AGLOMERADO         46241 non-null  int64  
 2   MAS_500            46241 non-null  object 
 3   Sexo               46241 non-null  object 
 4   Edad               46241 non-null  int64  
 5   NIVEL_ED_2         46241 non-null  int64  
 6   NIVEL_ED           46241 non-null  int64  
 7   ESTADO             46241 non-null  int64  
 8   CAT_OCUP           46241 non-null  int64  
 9   Cant_Ocup          46241 non-null  int64  
 10  Horas_sem          46241 non-null  float64
 11  INTENSI            46241 non-null  int64  
 12  Tipo_empr          46241 non-null  float64
 13  Cod_activ          46241 non-null  int64  
 14  Tamaño_empr        46241 non-null  int64  
 15  Cod_Ocup           46241 non-null  int64  
 16  Lugar_trab         46241 non-null  int64  
 17  Ingresos           46241 non-null  int64  
 18  Tamaño_empr_2      46241 non-null  int64  
 19  CARACTER_OCUP      46241 non-null  object 
 20  JERARQUIA_OCUP     46241 non-null  int64  
 21  TECNOLOGIA_OCUP    46241 non-null  int64  
 22  CALIFICACION_OCUP  46241 non-null  int64  
 23  ACTIV_ECON         46241 non-null  object 
 24  CAT_ECON           46241 non-null  object 
dtypes: float64(2), int64(17), object(6)
memory usage: 9.2+ MB
In [221]:
#Analizaremos la correlación entre las variables numéricas
correlations2= df_1c.corr()
In [222]:
indx=correlations2.index

#Para visualizar la matriz de correlación
plt.figure(figsize=(16,12))
sns.heatmap(df_1c[indx].corr(),annot=True,cmap="YlGnBu")
Out[222]:
<AxesSubplot:>

Al inicio, en la Matriz de correlación 1, sin ninguna transformación de las columnas del dataframe, la variable con mayor correlación con los ingresos ("P21") era la categoría ocupacional ("CAT_OCUP"), con un coeficiente de correlación de 0,6.

Una vez realizada la limpieza y el etiquetado de algunas columnas, vemos cómo aumentaron los valores de correlación de las distintas variables con la variable Ingresos.
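Una forma rápida de leer ese aumento sin recorrer todo el heatmap es ordenar la columna de la variable objetivo dentro de la matriz de correlación. Boceto con datos inventados (sobre el dataset real bastaría `correlations2['Ingresos'].sort_values(ascending=False)`):

```python
import pandas as pd

df_toy = pd.DataFrame({
    'Edad':      [25, 32, 47, 51, 38, 29],
    'Horas_sem': [20, 40, 45, 40, 35, 30],
    'Ingresos':  [30, 70, 95, 90, 60, 45],
})

# Correlación de cada variable numérica con la variable objetivo, ordenada
corr_obj = df_toy.corr()['Ingresos'].drop('Ingresos').sort_values(ascending=False)
print(corr_obj)
```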

OneHotEncoding¶

In [223]:
df_1b.columns
Out[223]:
Index(['REGION', 'AGLOMERADO', 'MAS_500', 'Sexo', 'Edad', 'NIVEL_ED_2',
       'NIVEL_ED', 'ESTADO', 'CAT_OCUP', 'Cant_Ocup', 'Horas_sem', 'INTENSI',
       'Tipo_empr', 'Cod_activ', 'Tamaño_empr', 'Cod_Ocup', 'Lugar_trab',
       'Ingresos', 'Tamaño_empr_2', 'CARACTER_OCUP', 'JERARQUIA_OCUP',
       'TECNOLOGIA_OCUP', 'CALIFICACION_OCUP', 'ACTIV_ECON', 'CAT_ECON'],
      dtype='object')
In [224]:
#Variables a las que aplicaremos One-Hot Encoding
OH_Encodear=['REGION','MAS_500','Sexo','CAT_ECON']
In [225]:
#Aplicamos OHE
df_1b = pd.get_dummies(df_1b, columns= OH_Encodear, drop_first=False)
In [226]:
df_1b.info()
<class 'pandas.core.frame.DataFrame'>
Int64Index: 46241 entries, 0 to 49705
Data columns (total 52 columns):
 #   Column             Non-Null Count  Dtype  
---  ------             --------------  -----  
 0   AGLOMERADO         46241 non-null  int64  
 1   Edad               46241 non-null  int64  
 2   NIVEL_ED_2         46241 non-null  int64  
 3   NIVEL_ED           46241 non-null  int64  
 4   ESTADO             46241 non-null  int64  
 5   CAT_OCUP           46241 non-null  int64  
 6   Cant_Ocup          46241 non-null  int64  
 7   Horas_sem          46241 non-null  float64
 8   INTENSI            46241 non-null  int64  
 9   Tipo_empr          46241 non-null  float64
 10  Cod_activ          46241 non-null  int64  
 11  Tamaño_empr        46241 non-null  int64  
 12  Cod_Ocup           46241 non-null  int64  
 13  Lugar_trab         46241 non-null  int64  
 14  Ingresos           46241 non-null  int64  
 15  Tamaño_empr_2      46241 non-null  int64  
 16  CARACTER_OCUP      46241 non-null  object 
 17  JERARQUIA_OCUP     46241 non-null  int64  
 18  TECNOLOGIA_OCUP    46241 non-null  int64  
 19  CALIFICACION_OCUP  46241 non-null  int64  
 20  ACTIV_ECON         46241 non-null  object 
 21  REGION_CUYO        46241 non-null  uint8  
 22  REGION_GBA         46241 non-null  uint8  
 23  REGION_NEA         46241 non-null  uint8  
 24  REGION_NOA         46241 non-null  uint8  
 25  REGION_PAMPEANA    46241 non-null  uint8  
 26  REGION_PATAGONIA   46241 non-null  uint8  
 27  MAS_500_N          46241 non-null  uint8  
 28  MAS_500_S          46241 non-null  uint8  
 29  Sexo_F             46241 non-null  uint8  
 30  Sexo_M             46241 non-null  uint8  
 31  CAT_ECON_A         46241 non-null  uint8  
 32  CAT_ECON_C         46241 non-null  uint8  
 33  CAT_ECON_D         46241 non-null  uint8  
 34  CAT_ECON_E         46241 non-null  uint8  
 35  CAT_ECON_F         46241 non-null  uint8  
 36  CAT_ECON_G         46241 non-null  uint8  
 37  CAT_ECON_H         46241 non-null  uint8  
 38  CAT_ECON_I         46241 non-null  uint8  
 39  CAT_ECON_J         46241 non-null  uint8  
 40  CAT_ECON_K         46241 non-null  uint8  
 41  CAT_ECON_L         46241 non-null  uint8  
 42  CAT_ECON_M         46241 non-null  uint8  
 43  CAT_ECON_N         46241 non-null  uint8  
 44  CAT_ECON_O         46241 non-null  uint8  
 45  CAT_ECON_P         46241 non-null  uint8  
 46  CAT_ECON_Q         46241 non-null  uint8  
 47  CAT_ECON_R         46241 non-null  uint8  
 48  CAT_ECON_S         46241 non-null  uint8  
 49  CAT_ECON_T         46241 non-null  uint8  
 50  CAT_ECON_U         46241 non-null  uint8  
 51  CAT_ECON_W         46241 non-null  uint8  
dtypes: float64(2), int64(17), object(2), uint8(31)
memory usage: 9.1+ MB
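Una nota sobre `drop_first=False`: mantener todas las dummies deja una columna redundante por variable (p. ej., `Sexo_M` ya determina `Sexo_F`), lo que puede introducir multicolinealidad en modelos lineales como la regresión. Boceto mínimo con datos inventados:

```python
import pandas as pd

toy = pd.DataFrame({'Sexo': ['F', 'M', 'F'], 'MAS_500': ['S', 'N', 'S']})

todas = pd.get_dummies(toy)                              # una dummy por categoría
sin_redundantes = pd.get_dummies(toy, drop_first=True)   # descarta la primera de cada variable

print(todas.shape[1], sin_redundantes.shape[1])  # 4 2
```

Para los modelos de árboles que compararemos más adelante la redundancia no es un problema, por lo que conservar todas las columnas es una decisión razonable.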

Transformación de variables de tipo object¶

Como se mencionó más arriba, transformaremos a numéricas las columnas "CARACTER_OCUP" y "ACTIV_ECON". Cada una contiene códigos, pero almacenados en formato string.

In [227]:
df_1b.describe(include=object)
Out[227]:
CARACTER_OCUP ACTIV_ECON
count 46241 46241
unique 52 78
top 0 0
freq 28487 28639
In [228]:
df_1b[['CARACTER_OCUP','ACTIV_ECON']]=df_1b[['CARACTER_OCUP','ACTIV_ECON']].astype(int)
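Este `astype(int)` funciona porque todos los valores son códigos numéricos en string; si quedara algún texto no convertible, lanzaría ValueError. Una variante defensiva (boceto; asume que lo no convertible puede imputarse como 0, igual que 'Ns/Nr'):

```python
import pandas as pd

s = pd.Series(['0', '12', '3', 'Ns/Nr'])

# to_numeric con errors='coerce' convierte lo inválido en NaN, que luego imputamos
convertida = pd.to_numeric(s, errors='coerce').fillna(0).astype(int)
print(convertida.tolist())  # [0, 12, 3, 0]
```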

Eliminación de outliers¶

Emplearemos la técnica de Isolation Forest.
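El parámetro `contamination=0.15` fija de antemano la fracción de filas que el modelo marcará como outliers (predicción -1); es un supuesto nuestro sobre la proporción de anomalías, no algo que el algoritmo estime. Boceto sobre datos sintéticos:

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.RandomState(42)
X = np.vstack([rng.normal(0, 1, (95, 2)),     # nube "normal"
               rng.uniform(8, 10, (5, 2))])   # 5 puntos claramente alejados

iso = IsolationForest(contamination=0.05, random_state=42).fit(X)
pred = iso.predict(X)          # 1 = inlier, -1 = outlier
print((pred == -1).sum())      # ≈ 5 filas marcadas como outliers
```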

In [229]:
isolation_forest = IsolationForest(contamination=0.15)
In [230]:
isolation_forest.fit(df_1b)
Out[230]:
IsolationForest(contamination=0.15)
In [231]:
y_outlier = isolation_forest.predict(df_1b)
In [232]:
df_1b['is_outlier'] = y_outlier
In [233]:
df_1b.sample(10)
Out[233]:
AGLOMERADO Edad NIVEL_ED_2 NIVEL_ED ESTADO CAT_OCUP Cant_Ocup Horas_sem INTENSI Tipo_empr ... CAT_ECON_N CAT_ECON_O CAT_ECON_P CAT_ECON_Q CAT_ECON_R CAT_ECON_S CAT_ECON_T CAT_ECON_U CAT_ECON_W is_outlier
35528 13 15 3 3 0 0 0 0.0 0 0.0 ... 0 0 0 0 0 0 0 0 1 1
33656 26 18 7 5 0 0 0 0.0 0 0.0 ... 0 0 0 0 0 0 0 0 1 1
38003 31 58 4 4 1 1 1 40.0 2 2.0 ... 0 0 0 0 0 0 0 0 0 1
7170 17 73 2 1 0 0 0 0.0 0 0.0 ... 0 0 0 0 0 0 0 0 1 1
14153 15 43 7 6 1 2 1 40.0 2 2.0 ... 0 0 0 0 0 0 0 0 0 -1
28781 5 45 7 5 1 2 1 48.0 3 3.0 ... 0 1 0 0 0 0 0 0 0 -1
34087 5 64 4 4 0 0 0 0.0 0 0.0 ... 0 0 0 0 0 0 0 0 1 1
23111 22 34 4 4 1 1 1 36.0 2 2.0 ... 0 0 0 0 0 1 0 0 0 -1
4779 17 65 6 6 0 0 0 0.0 0 0.0 ... 0 0 0 0 0 0 0 0 1 1
7554 25 18 4 3 0 0 0 0.0 0 0.0 ... 0 0 0 0 0 0 0 0 1 1

10 rows × 53 columns

In [234]:
df_1b['is_outlier'].value_counts()
Out[234]:
 1    39305
-1     6936
Name: is_outlier, dtype: int64
In [235]:
outliers=  df_1b[df_1b.is_outlier == -1]
In [236]:
outliers
Out[236]:
AGLOMERADO Edad NIVEL_ED_2 NIVEL_ED ESTADO CAT_OCUP Cant_Ocup Horas_sem INTENSI Tipo_empr ... CAT_ECON_N CAT_ECON_O CAT_ECON_P CAT_ECON_Q CAT_ECON_R CAT_ECON_S CAT_ECON_T CAT_ECON_U CAT_ECON_W is_outlier
0 14 42 6 6 1 2 1 50.0 3 3.0 ... 0 1 0 0 0 0 0 0 0 -1
12 31 56 4 4 1 2 1 40.0 2 3.0 ... 0 1 0 0 0 0 0 0 0 -1
14 31 26 6 6 1 2 0 40.0 2 2.0 ... 0 0 0 0 0 0 0 0 0 -1
16 31 65 4 4 1 2 1 35.0 2 3.0 ... 0 0 0 0 0 0 0 0 0 -1
20 13 30 3 3 1 2 0 40.0 2 2.0 ... 0 0 0 0 0 0 0 0 0 -1
... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ...
49673 93 35 6 5 1 2 1 35.0 2 3.0 ... 0 0 0 1 0 0 0 0 0 -1
49675 17 40 4 4 1 2 1 40.0 2 2.0 ... 0 0 0 0 0 1 0 0 0 -1
49693 32 42 8 6 1 1 1 40.0 2 2.0 ... 1 0 0 0 0 0 0 0 0 -1
49695 32 36 7 6 1 2 2 40.0 3 3.0 ... 0 0 0 0 0 0 0 0 0 -1
49704 2 34 6 6 1 2 1 72.0 3 3.0 ... 0 1 0 0 0 0 0 0 0 -1

6936 rows × 53 columns

Eliminaremos los outliers que indica Isolation Forest (las filas con is_outlier == -1).

In [237]:
df_sin_outliers = df_1b[df_1b.is_outlier == 1]

Cross-Validation: competencia de modelos¶

Haremos competir 5 modelos de regresión, mediante validación cruzada. Dividiremos el conjunto de datos de entrenamiento en 6 grupos o folds. Se seleccionará el que mejores predicciones realice, en función de las métricas de evaluación elegidas.
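La mecánica de `cross_val_score` con 6 folds puede ilustrarse con un ejemplo sintético. Obsérvese el signo: los scorers de error de scikit-learn son negativos ('neg_mean_absolute_error'), de ahí el -1 con el que los multiplicamos al promediar:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.RandomState(0)
X_toy = rng.normal(size=(120, 3))
y_toy = X_toy @ np.array([2.0, -1.0, 0.5]) + rng.normal(scale=0.1, size=120)

# cv=6 divide los datos en 6 folds; cada uno actúa una vez como validación
scores = cross_val_score(LinearRegression(), X_toy, y_toy,
                         scoring='neg_mean_absolute_error', cv=6)
mae_por_fold = -scores    # volvemos a valores positivos
print(len(mae_por_fold), mae_por_fold.mean())
```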

En primer lugar definimos las variables independientes "X" y la variable objetivo "y".

In [238]:
X = df_sin_outliers.drop(['Ingresos','is_outlier'], axis=1)
y = df_sin_outliers[['Ingresos']]

Split de la data.

In [239]:
# Realizamos la primera partición 
X_train_temp, X_test, y_train_temp, y_test = train_test_split(X, y, test_size=0.30, random_state=42)

# Realizamos la segunda partición
X_train, X_valid, y_train, y_valid = train_test_split(X_train_temp, y_train_temp, 
                                                      test_size=0.3, random_state=42)
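Con estas dos particiones encadenadas (30 % del total y luego 30 % del resto), las fracciones finales sobre el total quedan así:

```python
test = 0.30                     # primera partición
valid = round(0.70 * 0.30, 2)   # 0.21 del total
train = round(0.70 * 0.70, 2)   # 0.49 del total
print(train, valid, test)
```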

Definiremos una función que compare los resultados del CV del conjunto de modelos a competir.

In [240]:
# Definimos una función que compare el desempeño de CV en el conjunto de modelos a competir
def cv_comparison(modelos, X, y, cv):
    # Creamos un DataFrame para los promedios y una lista para todas las métricas
    cv_metricas = pd.DataFrame()
    maes = []
    mses = []
    r2s = []
 
    # Hacemos un loop en los modelos, corremos CV, agregamos los puntajes promedio obtenidos al DataFrame 
    # y los puntajes de todos los CVs a la lista
    
    for modelo in modelos:
        mae = -np.round(cross_val_score(modelo, X, y, scoring='neg_mean_absolute_error', cv=cv), 6)
        maes.append(mae)
        mae_avg = round(mae.mean(), 0)
        mse = -np.round(cross_val_score(modelo, X, y, scoring='neg_mean_squared_error', cv=cv), 6)
        mses.append(mse)
        mse_avg = round(mse.mean(), 0)
        r2 = np.round(cross_val_score(modelo, X, y, scoring='r2', cv=cv), 6)
        r2s.append(r2)
        r2_avg = round(r2.mean(), 4)
        cv_metricas[str(modelo)] = [mae_avg, mse_avg, r2_avg]
    cv_metricas.index = ['Mean Absolute Error', 'Mean Squared Error', 'R^2']
    return cv_metricas, maes, mses, r2s

Creamos los modelos a ser evaluados.

In [241]:
#Asignamos una variable a cada modelo
RLM = LinearRegression()
GBR = GradientBoostingRegressor()
HGBR = HistGradientBoostingRegressor()
CBR = CatBoostRegressor()
XGBR = XGBRegressor()

#Ponemos los modelos en una lista que emplearemos en Cross-Validation
modelos=[RLM, GBR, HGBR, CBR, XGBR]

Aplicamos Cross-Validation a los modelos.

In [ ]:
#Corremos la función arriba definida
cv_metricas, maes, mses, r2s = cv_comparison(modelos, X_train_temp, y_train_temp, 6)
In [251]:
cv_metricas.columns
Out[251]:
Index(['LinearRegression()', 'GradientBoostingRegressor()',
       'HistGradientBoostingRegressor()',
       '<catboost.core.CatBoostRegressor object at 0x7f9be4f98b50>',
       'XGBRegressor(base_score=None, booster=None, callbacks=None,\n             colsample_bylevel=None, colsample_bynode=None,\n             colsample_bytree=None, early_stopping_rounds=None,\n             enable_categorical=False, eval_metric=None, feature_types=None,\n             gamma=None, gpu_id=None, grow_policy=None, importance_type=None,\n             interaction_constraints=None, learning_rate=None, max_bin=None,\n             max_cat_threshold=None, max_cat_to_onehot=None,\n             max_delta_step=None, max_depth=None, max_leaves=None,\n             min_child_weight=None, missing=nan, monotone_constraints=None,\n             n_estimators=100, n_jobs=None, num_parallel_tree=None,\n             predictor=None, random_state=None, ...)'],
      dtype='object')
In [252]:
#Renombramos las columnas de la tabla de resultados
cv_metricas = cv_metricas.rename(columns={'LinearRegression()':'RLM','GradientBoostingRegressor()': 'GBR', 'HistGradientBoostingRegressor()':'HGBR', 
                                          '<catboost.core.CatBoostRegressor object at 0x7f9be4f98b50>':'CBR',
                                          'XGBRegressor(base_score=None, booster=None, callbacks=None,\n             colsample_bylevel=None, colsample_bynode=None,\n             colsample_bytree=None, early_stopping_rounds=None,\n             enable_categorical=False, eval_metric=None, feature_types=None,\n             gamma=None, gpu_id=None, grow_policy=None, importance_type=None,\n             interaction_constraints=None, learning_rate=None, max_bin=None,\n             max_cat_threshold=None, max_cat_to_onehot=None,\n             max_delta_step=None, max_depth=None, max_leaves=None,\n             min_child_weight=None, missing=nan, monotone_constraints=None,\n             n_estimators=100, n_jobs=None, num_parallel_tree=None,\n             predictor=None, random_state=None, ...)':'XGBR'})
In [253]:
cv_metricas.columns
Out[253]:
Index(['RLM', 'GBR', 'HGBR', 'CBR', 'XGBR'], dtype='object')
In [254]:
#Modificamos el formato de las columnas
cv_metricas['RLM'] = cv_metricas['RLM'].map('{:,.4f}'.format)
cv_metricas['GBR'] = cv_metricas['GBR'].map('{:,.4f}'.format)
cv_metricas['HGBR'] = cv_metricas['HGBR'].map('{:,.4f}'.format)
cv_metricas['CBR'] = cv_metricas['CBR'].map('{:,.4f}'.format)
cv_metricas['XGBR'] = cv_metricas['XGBR'].map('{:,.4f}'.format)

Tablas de resultados¶

In [255]:
cv_metricas
Out[255]:
RLM GBR HGBR CBR XGBR
Mean Absolute Error 5,874.0000 3,900.0000 3,714.0000 3,767.0000 3,895.0000
Mean Squared Error 144,315,692.0000 108,154,847.0000 102,280,094.0000 102,319,507.0000 109,374,447.0000
R^2 0.7637 0.8229 0.8330 0.8323 0.8206

Vemos que los modelos HistGradientBoostingRegressor y CatBoostRegressor tienen un desempeño muy similar. HGBR presenta tanto el menor MAE promedio como el mayor R^2 promedio (0,8330 frente a 0,8323 de CBR), aunque CBR lo supera por poco en algunos folds.

A continuación veremos el valor de estas métricas para cada uno de los folds del Cross-Validation.

Tabla de valores de R^2 en cada fold¶

In [256]:
# Creamos un DataFrame para todos los R2
r2_comp = pd.DataFrame(r2s, index=cv_metricas.columns, columns=['1er Fold', '2do Fold', '3er Fold', 
                                                         '4to Fold', '5to Fold','6to Fold'])

# Agregamos una columna para los promedios
r2_comp['Promedio'] = np.round(r2_comp.mean(axis=1),4)
In [257]:
r2_comp
Out[257]:
1er Fold 2do Fold 3er Fold 4to Fold 5to Fold 6to Fold Promedio
RLM 0.766674 0.738987 0.762353 0.772504 0.775893 0.765555 0.7637
GBR 0.825770 0.805149 0.825473 0.832264 0.834672 0.814192 0.8229
HGBR 0.832078 0.819088 0.833993 0.840244 0.846470 0.825866 0.8330
CBR 0.833862 0.819295 0.830218 0.839403 0.846525 0.824659 0.8323
XGBR 0.819112 0.810363 0.818531 0.828417 0.835951 0.811410 0.8206

De la tabla de valores de R^2 se observa que, para un mismo modelo, los resultados de los distintos folds son muy similares entre sí: una señal de modelos robustos, con baja varianza entre particiones.
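Esa estabilidad entre folds puede cuantificarse con la desviación estándar por fila de la tabla; valores pequeños frente al promedio indican baja varianza del estimador. Boceto con una tabla de juguete de la misma estructura:

```python
import pandas as pd

r2_toy = pd.DataFrame(
    [[0.83, 0.82, 0.83, 0.84, 0.85, 0.83],
     [0.76, 0.74, 0.76, 0.77, 0.78, 0.77]],
    index=['HGBR', 'RLM'],
    columns=['1er Fold', '2do Fold', '3er Fold', '4to Fold', '5to Fold', '6to Fold'])

dispersion = r2_toy.std(axis=1)   # desviación estándar entre folds
print(dispersion.round(3).to_dict())
```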

Tabla de valores de MAE en cada fold¶

In [258]:
# Creamos un DataFrame para todos los MAE
MAE_comp = pd.DataFrame(maes, index=cv_metricas.columns, columns=['1er Fold', '2do Fold', '3er Fold', 
                                                         '4to Fold', '5to Fold','6to Fold'])

# Agregamos una columna para los promedios
MAE_comp['Promedio'] = np.round(MAE_comp.mean(axis=1),0)
In [259]:
MAE_comp
Out[259]:
1er Fold 2do Fold 3er Fold 4to Fold 5to Fold 6to Fold Promedio
RLM 5693.950188 6163.287951 5822.793999 5778.938773 5900.426440 5885.612813 5874.0
GBR 3664.065427 4194.394384 3793.326783 3751.081116 3955.241690 4044.364606 3900.0
HGBR 3474.104950 4000.653980 3690.380755 3582.712299 3711.584740 3822.063450 3714.0
CBR 3540.076473 4023.559776 3784.678796 3656.113523 3726.773502 3872.930444 3767.0
XGBR 3669.414008 4111.165616 3918.492832 3742.888293 3900.955136 4028.045184 3895.0

Con los valores de MAE ocurre algo similar a lo que vimos con R^2: la baja dispersión entre folds refuerza la idea de que los modelos generalizan de forma estable.

Entrenamiento de los modelos HGBR y CBR¶

Ahora que identificamos los dos modelos de mejor desempeño, HistGradientBoostingRegressor y CatBoostRegressor, los entrenaremos mediante validación simple para comparar sus predicciones con los valores reales de y_test.
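El patrón de evaluación que aplicaremos (ajustar sobre train, predecir sobre test y medir MAE y R^2) se puede bosquejar con un modelo lineal de juguete:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error, r2_score
from sklearn.model_selection import train_test_split

rng = np.random.RandomState(42)
X = rng.normal(size=(200, 2))
y = 3 * X[:, 0] - 2 * X[:, 1] + rng.normal(scale=0.1, size=200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.30, random_state=42)
modelo = LinearRegression().fit(X_tr, y_tr)
y_pred = modelo.predict(X_te)

# MAE bajo y R^2 alto: el modelo recupera la relación lineal subyacente
print(round(mean_absolute_error(y_te, y_pred), 3), round(r2_score(y_te, y_pred), 3))
```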

Asignamos una variable a los modelos.

In [260]:
HGBR2 = HistGradientBoostingRegressor()
CBR2 = CatBoostRegressor()

Entrenamiento de los modelos.

In [261]:
# Recordemos la primera partición realizada
# X_train_temp, X_test, y_train_temp, y_test = train_test_split(X, y, test_size=0.30, random_state=42)
In [262]:
HGBR2.fit(X_train_temp,y_train_temp)
CBR2.fit(X_train_temp,y_train_temp)
Learning rate set to 0.069124
0:	learn: 23406.1624464	total: 6.38ms	remaining: 6.37s
[salida de entrenamiento de CatBoost truncada: una línea por iteración]
265:	learn: 8944.6515824	total: 1.69s	remaining: 4.67s
266:	learn: 8940.0682912	total: 1.7s	remaining: 4.66s
267:	learn: 8935.0203229	total: 1.7s	remaining: 4.65s
268:	learn: 8934.5037361	total: 1.71s	remaining: 4.64s
269:	learn: 8931.4307586	total: 1.71s	remaining: 4.63s
270:	learn: 8930.6885292	total: 1.72s	remaining: 4.63s
271:	learn: 8927.7091052	total: 1.73s	remaining: 4.63s
272:	learn: 8927.3419624	total: 1.74s	remaining: 4.64s
273:	learn: 8924.3532031	total: 1.75s	remaining: 4.63s
274:	learn: 8918.0257209	total: 1.75s	remaining: 4.62s
275:	learn: 8917.3390896	total: 1.76s	remaining: 4.61s
276:	learn: 8916.8984224	total: 1.76s	remaining: 4.61s
277:	learn: 8912.8090994	total: 1.77s	remaining: 4.6s
278:	learn: 8909.1160220	total: 1.77s	remaining: 4.59s
279:	learn: 8905.7090875	total: 1.78s	remaining: 4.58s
280:	learn: 8901.5369115	total: 1.79s	remaining: 4.57s
281:	learn: 8898.7617035	total: 1.79s	remaining: 4.57s
282:	learn: 8897.7665352	total: 1.8s	remaining: 4.57s
283:	learn: 8893.5169022	total: 1.81s	remaining: 4.56s
284:	learn: 8890.1574427	total: 1.82s	remaining: 4.56s
285:	learn: 8889.8790146	total: 1.82s	remaining: 4.55s
286:	learn: 8885.8304872	total: 1.83s	remaining: 4.54s
287:	learn: 8882.0621322	total: 1.83s	remaining: 4.53s
288:	learn: 8878.0530501	total: 1.84s	remaining: 4.53s
289:	learn: 8873.2723870	total: 1.85s	remaining: 4.52s
290:	learn: 8871.3091215	total: 1.85s	remaining: 4.51s
291:	learn: 8870.9249029	total: 1.86s	remaining: 4.51s
292:	learn: 8866.9209530	total: 1.86s	remaining: 4.5s
293:	learn: 8866.5528916	total: 1.87s	remaining: 4.49s
294:	learn: 8864.3313917	total: 1.88s	remaining: 4.48s
295:	learn: 8861.1173887	total: 1.88s	remaining: 4.48s
296:	learn: 8857.3502289	total: 1.89s	remaining: 4.47s
297:	learn: 8854.3787819	total: 1.89s	remaining: 4.46s
298:	learn: 8853.6204320	total: 1.9s	remaining: 4.45s
299:	learn: 8850.5998522	total: 1.91s	remaining: 4.45s
300:	learn: 8845.8755948	total: 1.91s	remaining: 4.43s
301:	learn: 8842.8637712	total: 1.92s	remaining: 4.43s
302:	learn: 8842.5080728	total: 1.92s	remaining: 4.42s
303:	learn: 8841.4230570	total: 1.93s	remaining: 4.41s
304:	learn: 8838.0839254	total: 1.93s	remaining: 4.4s
305:	learn: 8834.9910966	total: 1.94s	remaining: 4.39s
306:	learn: 8831.5969526	total: 1.94s	remaining: 4.38s
307:	learn: 8829.8534343	total: 1.95s	remaining: 4.38s
308:	learn: 8829.5168888	total: 1.95s	remaining: 4.37s
309:	learn: 8825.5778247	total: 1.96s	remaining: 4.36s
310:	learn: 8823.0651336	total: 1.96s	remaining: 4.35s
311:	learn: 8819.9994778	total: 1.97s	remaining: 4.34s
312:	learn: 8818.0459621	total: 1.98s	remaining: 4.33s
313:	learn: 8815.4466436	total: 1.98s	remaining: 4.33s
314:	learn: 8814.4707222	total: 1.99s	remaining: 4.32s
315:	learn: 8809.0315316	total: 1.99s	remaining: 4.31s
316:	learn: 8806.2274006	total: 2s	remaining: 4.31s
317:	learn: 8803.3668781	total: 2.02s	remaining: 4.32s
318:	learn: 8802.2491530	total: 2.02s	remaining: 4.31s
319:	learn: 8801.9498941	total: 2.02s	remaining: 4.3s
320:	learn: 8797.5260244	total: 2.03s	remaining: 4.3s
321:	learn: 8792.8783365	total: 2.04s	remaining: 4.29s
322:	learn: 8791.9537368	total: 2.04s	remaining: 4.28s
323:	learn: 8791.0725372	total: 2.05s	remaining: 4.27s
324:	learn: 8787.4615635	total: 2.05s	remaining: 4.26s
325:	learn: 8784.9980631	total: 2.06s	remaining: 4.25s
326:	learn: 8781.0342218	total: 2.06s	remaining: 4.25s
327:	learn: 8778.2105662	total: 2.07s	remaining: 4.24s
328:	learn: 8775.4820551	total: 2.08s	remaining: 4.24s
329:	learn: 8772.0587637	total: 2.08s	remaining: 4.23s
330:	learn: 8770.3257440	total: 2.09s	remaining: 4.22s
331:	learn: 8766.6978158	total: 2.09s	remaining: 4.21s
332:	learn: 8759.4835071	total: 2.1s	remaining: 4.2s
333:	learn: 8756.9062293	total: 2.1s	remaining: 4.2s
334:	learn: 8753.8908819	total: 2.11s	remaining: 4.19s
335:	learn: 8751.2584199	total: 2.12s	remaining: 4.18s
336:	learn: 8748.8696699	total: 2.12s	remaining: 4.17s
337:	learn: 8744.9398258	total: 2.13s	remaining: 4.17s
338:	learn: 8741.5547501	total: 2.13s	remaining: 4.16s
339:	learn: 8738.7621218	total: 2.14s	remaining: 4.16s
340:	learn: 8735.7282999	total: 2.15s	remaining: 4.15s
341:	learn: 8733.4189609	total: 2.16s	remaining: 4.16s
342:	learn: 8730.3129094	total: 2.17s	remaining: 4.15s
343:	learn: 8728.3239810	total: 2.17s	remaining: 4.14s
344:	learn: 8726.7057882	total: 2.17s	remaining: 4.13s
345:	learn: 8723.5858650	total: 2.18s	remaining: 4.12s
346:	learn: 8716.5544987	total: 2.19s	remaining: 4.12s
347:	learn: 8712.7059884	total: 2.2s	remaining: 4.12s
348:	learn: 8709.1233368	total: 2.2s	remaining: 4.11s
349:	learn: 8707.4436934	total: 2.21s	remaining: 4.1s
350:	learn: 8704.5730182	total: 2.21s	remaining: 4.1s
351:	learn: 8701.6007056	total: 2.22s	remaining: 4.09s
352:	learn: 8697.3549093	total: 2.23s	remaining: 4.08s
353:	learn: 8693.5389702	total: 2.23s	remaining: 4.08s
354:	learn: 8691.8135394	total: 2.24s	remaining: 4.07s
355:	learn: 8687.9280211	total: 2.25s	remaining: 4.06s
356:	learn: 8685.0964746	total: 2.25s	remaining: 4.05s
357:	learn: 8682.5849605	total: 2.26s	remaining: 4.05s
358:	learn: 8682.0390508	total: 2.26s	remaining: 4.04s
359:	learn: 8679.7784871	total: 2.27s	remaining: 4.03s
360:	learn: 8676.4427054	total: 2.27s	remaining: 4.02s
361:	learn: 8672.5131089	total: 2.28s	remaining: 4.02s
362:	learn: 8671.7719855	total: 2.28s	remaining: 4.01s
363:	learn: 8669.7197470	total: 2.29s	remaining: 4s
364:	learn: 8667.6413392	total: 2.29s	remaining: 3.99s
365:	learn: 8666.8981958	total: 2.3s	remaining: 3.98s
366:	learn: 8663.1015797	total: 2.31s	remaining: 3.98s
367:	learn: 8660.1115781	total: 2.31s	remaining: 3.97s
368:	learn: 8656.9962726	total: 2.32s	remaining: 3.96s
369:	learn: 8653.5693689	total: 2.32s	remaining: 3.96s
370:	learn: 8652.0181540	total: 2.33s	remaining: 3.95s
371:	learn: 8644.4329376	total: 2.33s	remaining: 3.94s
372:	learn: 8643.0413237	total: 2.34s	remaining: 3.93s
373:	learn: 8640.0658596	total: 2.35s	remaining: 3.93s
374:	learn: 8636.0044939	total: 2.35s	remaining: 3.92s
375:	learn: 8633.1590922	total: 2.36s	remaining: 3.91s
376:	learn: 8631.1728857	total: 2.36s	remaining: 3.9s
377:	learn: 8627.4818125	total: 2.37s	remaining: 3.9s
378:	learn: 8624.1329498	total: 2.37s	remaining: 3.89s
379:	learn: 8618.6830745	total: 2.38s	remaining: 3.88s
380:	learn: 8617.2670202	total: 2.38s	remaining: 3.88s
381:	learn: 8613.8744650	total: 2.39s	remaining: 3.87s
382:	learn: 8610.8492208	total: 2.4s	remaining: 3.87s
383:	learn: 8607.8755912	total: 2.41s	remaining: 3.86s
384:	learn: 8604.2144911	total: 2.41s	remaining: 3.85s
385:	learn: 8601.6610609	total: 2.42s	remaining: 3.85s
386:	learn: 8600.3671647	total: 2.42s	remaining: 3.84s
387:	learn: 8597.1195066	total: 2.43s	remaining: 3.83s
388:	learn: 8594.1771521	total: 2.44s	remaining: 3.83s
389:	learn: 8592.0235941	total: 2.44s	remaining: 3.83s
390:	learn: 8587.6383426	total: 2.45s	remaining: 3.82s
391:	learn: 8584.7373006	total: 2.46s	remaining: 3.82s
392:	learn: 8582.6904428	total: 2.48s	remaining: 3.82s
393:	learn: 8580.1095111	total: 2.48s	remaining: 3.82s
394:	learn: 8577.3715582	total: 2.49s	remaining: 3.81s
395:	learn: 8575.1812212	total: 2.49s	remaining: 3.8s
396:	learn: 8572.4696668	total: 2.5s	remaining: 3.8s
397:	learn: 8569.4389842	total: 2.51s	remaining: 3.79s
398:	learn: 8566.6857533	total: 2.51s	remaining: 3.78s
399:	learn: 8564.2503956	total: 2.52s	remaining: 3.78s
400:	learn: 8560.6258409	total: 2.52s	remaining: 3.77s
401:	learn: 8559.0697048	total: 2.53s	remaining: 3.76s
402:	learn: 8555.5863296	total: 2.54s	remaining: 3.76s
403:	learn: 8554.1535072	total: 2.54s	remaining: 3.75s
404:	learn: 8553.0130115	total: 2.55s	remaining: 3.74s
405:	learn: 8549.8114412	total: 2.55s	remaining: 3.73s
406:	learn: 8547.5857310	total: 2.56s	remaining: 3.73s
407:	learn: 8544.4095688	total: 2.56s	remaining: 3.72s
408:	learn: 8539.0726398	total: 2.57s	remaining: 3.71s
409:	learn: 8537.5237208	total: 2.58s	remaining: 3.71s
410:	learn: 8531.4121544	total: 2.58s	remaining: 3.7s
411:	learn: 8528.9643986	total: 2.59s	remaining: 3.69s
412:	learn: 8526.6938612	total: 2.59s	remaining: 3.69s
413:	learn: 8521.1534232	total: 2.6s	remaining: 3.68s
414:	learn: 8517.0921891	total: 2.61s	remaining: 3.68s
415:	learn: 8514.2198586	total: 2.61s	remaining: 3.67s
416:	learn: 8512.6974935	total: 2.62s	remaining: 3.66s
417:	learn: 8508.4683439	total: 2.63s	remaining: 3.65s
418:	learn: 8506.0760220	total: 2.63s	remaining: 3.65s
419:	learn: 8501.1555878	total: 2.64s	remaining: 3.64s
420:	learn: 8500.6978861	total: 2.64s	remaining: 3.63s
421:	learn: 8496.6198004	total: 2.65s	remaining: 3.63s
422:	learn: 8493.3185140	total: 2.65s	remaining: 3.62s
423:	learn: 8490.2693498	total: 2.66s	remaining: 3.61s
424:	learn: 8485.2981310	total: 2.66s	remaining: 3.6s
425:	learn: 8482.8887393	total: 2.67s	remaining: 3.6s
426:	learn: 8480.1354469	total: 2.67s	remaining: 3.59s
427:	learn: 8477.5141958	total: 2.68s	remaining: 3.58s
428:	learn: 8474.6327453	total: 2.69s	remaining: 3.58s
429:	learn: 8469.9596070	total: 2.69s	remaining: 3.57s
430:	learn: 8468.6707321	total: 2.7s	remaining: 3.56s
431:	learn: 8466.8663117	total: 2.7s	remaining: 3.56s
432:	learn: 8465.6487401	total: 2.71s	remaining: 3.55s
433:	learn: 8464.7131799	total: 2.72s	remaining: 3.54s
434:	learn: 8462.1124610	total: 2.73s	remaining: 3.54s
435:	learn: 8458.9624419	total: 2.74s	remaining: 3.55s
436:	learn: 8455.7453396	total: 2.76s	remaining: 3.56s
437:	learn: 8453.2413624	total: 2.77s	remaining: 3.55s
438:	learn: 8449.5787294	total: 2.77s	remaining: 3.55s
439:	learn: 8447.2596755	total: 2.78s	remaining: 3.54s
440:	learn: 8443.5106977	total: 2.79s	remaining: 3.54s
441:	learn: 8440.4758807	total: 2.8s	remaining: 3.53s
442:	learn: 8438.6893653	total: 2.8s	remaining: 3.52s
443:	learn: 8436.4544454	total: 2.81s	remaining: 3.52s
444:	learn: 8434.8863165	total: 2.81s	remaining: 3.51s
445:	learn: 8432.6911895	total: 2.82s	remaining: 3.51s
446:	learn: 8431.0681001	total: 2.83s	remaining: 3.5s
447:	learn: 8429.4821161	total: 2.83s	remaining: 3.49s
448:	learn: 8428.0964402	total: 2.84s	remaining: 3.48s
449:	learn: 8425.5968276	total: 2.85s	remaining: 3.48s
450:	learn: 8422.8662830	total: 2.85s	remaining: 3.47s
451:	learn: 8421.5223809	total: 2.86s	remaining: 3.46s
452:	learn: 8418.3500620	total: 2.86s	remaining: 3.46s
453:	learn: 8416.1441918	total: 2.87s	remaining: 3.45s
454:	learn: 8413.0097754	total: 2.88s	remaining: 3.44s
455:	learn: 8410.3684985	total: 2.88s	remaining: 3.44s
456:	learn: 8409.2453115	total: 2.89s	remaining: 3.43s
457:	learn: 8407.9763027	total: 2.89s	remaining: 3.42s
458:	learn: 8404.6542086	total: 2.9s	remaining: 3.42s
459:	learn: 8401.8258964	total: 2.9s	remaining: 3.41s
460:	learn: 8399.8860613	total: 2.91s	remaining: 3.4s
461:	learn: 8396.3333539	total: 2.92s	remaining: 3.4s
462:	learn: 8393.8398580	total: 2.92s	remaining: 3.39s
463:	learn: 8391.1141183	total: 2.93s	remaining: 3.38s
464:	learn: 8390.4489199	total: 2.93s	remaining: 3.38s
465:	learn: 8387.2543591	total: 2.94s	remaining: 3.37s
466:	learn: 8385.5868931	total: 2.95s	remaining: 3.36s
467:	learn: 8382.5358554	total: 2.95s	remaining: 3.36s
468:	learn: 8381.2496818	total: 2.96s	remaining: 3.35s
469:	learn: 8378.0631559	total: 2.96s	remaining: 3.34s
470:	learn: 8374.5835957	total: 2.97s	remaining: 3.34s
471:	learn: 8371.6025591	total: 2.98s	remaining: 3.33s
472:	learn: 8369.2747617	total: 2.98s	remaining: 3.32s
473:	learn: 8366.3675258	total: 2.99s	remaining: 3.32s
474:	learn: 8362.6924235	total: 3s	remaining: 3.32s
475:	learn: 8361.3399465	total: 3.01s	remaining: 3.31s
476:	learn: 8358.7463471	total: 3.01s	remaining: 3.3s
477:	learn: 8356.0435643	total: 3.02s	remaining: 3.3s
478:	learn: 8353.5907733	total: 3.03s	remaining: 3.3s
479:	learn: 8352.4836156	total: 3.04s	remaining: 3.29s
480:	learn: 8349.3656236	total: 3.04s	remaining: 3.28s
481:	learn: 8347.7691092	total: 3.05s	remaining: 3.28s
482:	learn: 8346.7330058	total: 3.06s	remaining: 3.27s
483:	learn: 8340.4608967	total: 3.06s	remaining: 3.27s
484:	learn: 8337.9918274	total: 3.07s	remaining: 3.26s
485:	learn: 8335.0995899	total: 3.07s	remaining: 3.25s
486:	learn: 8332.3539388	total: 3.08s	remaining: 3.24s
487:	learn: 8329.7240581	total: 3.08s	remaining: 3.24s
488:	learn: 8327.1950622	total: 3.09s	remaining: 3.23s
489:	learn: 8324.6422392	total: 3.1s	remaining: 3.22s
490:	learn: 8322.1643809	total: 3.1s	remaining: 3.22s
491:	learn: 8320.4318431	total: 3.11s	remaining: 3.21s
492:	learn: 8318.2338603	total: 3.11s	remaining: 3.2s
493:	learn: 8316.0586107	total: 3.12s	remaining: 3.19s
494:	learn: 8312.4255570	total: 3.12s	remaining: 3.19s
495:	learn: 8310.5137166	total: 3.13s	remaining: 3.18s
496:	learn: 8305.3638711	total: 3.14s	remaining: 3.17s
497:	learn: 8301.9112979	total: 3.14s	remaining: 3.17s
498:	learn: 8300.0580858	total: 3.15s	remaining: 3.16s
499:	learn: 8298.5490907	total: 3.15s	remaining: 3.15s
500:	learn: 8297.4031611	total: 3.16s	remaining: 3.15s
501:	learn: 8295.2986217	total: 3.17s	remaining: 3.14s
502:	learn: 8294.1670557	total: 3.17s	remaining: 3.13s
503:	learn: 8291.4370294	total: 3.18s	remaining: 3.13s
504:	learn: 8288.1290208	total: 3.18s	remaining: 3.12s
505:	learn: 8285.6762008	total: 3.19s	remaining: 3.11s
506:	learn: 8282.2370530	total: 3.2s	remaining: 3.11s
507:	learn: 8279.5963851	total: 3.2s	remaining: 3.1s
508:	learn: 8276.3435429	total: 3.21s	remaining: 3.1s
509:	learn: 8274.9730655	total: 3.21s	remaining: 3.09s
510:	learn: 8273.6912370	total: 3.22s	remaining: 3.08s
511:	learn: 8272.6304915	total: 3.23s	remaining: 3.08s
512:	learn: 8271.6184080	total: 3.23s	remaining: 3.07s
513:	learn: 8269.6805580	total: 3.24s	remaining: 3.06s
514:	learn: 8268.4217773	total: 3.24s	remaining: 3.05s
515:	learn: 8264.3140719	total: 3.25s	remaining: 3.05s
516:	learn: 8261.5985781	total: 3.25s	remaining: 3.04s
517:	learn: 8260.1314310	total: 3.26s	remaining: 3.03s
518:	learn: 8256.6095804	total: 3.27s	remaining: 3.03s
519:	learn: 8253.5188809	total: 3.27s	remaining: 3.02s
520:	learn: 8251.2521156	total: 3.28s	remaining: 3.01s
521:	learn: 8247.7562169	total: 3.28s	remaining: 3.01s
522:	learn: 8245.1838524	total: 3.29s	remaining: 3s
523:	learn: 8242.4586720	total: 3.3s	remaining: 2.99s
524:	learn: 8236.9330370	total: 3.3s	remaining: 2.99s
525:	learn: 8234.9228634	total: 3.31s	remaining: 2.98s
526:	learn: 8232.4457312	total: 3.31s	remaining: 2.97s
527:	learn: 8230.1559761	total: 3.32s	remaining: 2.97s
528:	learn: 8227.4331347	total: 3.33s	remaining: 2.96s
529:	learn: 8224.9932712	total: 3.33s	remaining: 2.96s
530:	learn: 8220.1540210	total: 3.34s	remaining: 2.95s
531:	learn: 8218.0693700	total: 3.35s	remaining: 2.94s
532:	learn: 8215.7781057	total: 3.35s	remaining: 2.94s
533:	learn: 8212.9663170	total: 3.36s	remaining: 2.93s
534:	learn: 8211.7003484	total: 3.37s	remaining: 2.92s
535:	learn: 8208.0618215	total: 3.38s	remaining: 2.92s
536:	learn: 8206.5261897	total: 3.39s	remaining: 2.92s
537:	learn: 8204.7971888	total: 3.4s	remaining: 2.92s
538:	learn: 8202.1185200	total: 3.41s	remaining: 2.92s
539:	learn: 8199.0338137	total: 3.44s	remaining: 2.93s
540:	learn: 8197.5965493	total: 3.45s	remaining: 2.93s
541:	learn: 8194.9490381	total: 3.47s	remaining: 2.93s
542:	learn: 8193.0381761	total: 3.48s	remaining: 2.93s
543:	learn: 8187.8352480	total: 3.5s	remaining: 2.94s
544:	learn: 8186.2168112	total: 3.52s	remaining: 2.94s
545:	learn: 8184.5062413	total: 3.54s	remaining: 2.94s
546:	learn: 8182.2446955	total: 3.56s	remaining: 2.95s
547:	learn: 8180.7101047	total: 3.59s	remaining: 2.96s
548:	learn: 8175.7447651	total: 3.6s	remaining: 2.96s
549:	learn: 8173.2706283	total: 3.62s	remaining: 2.96s
550:	learn: 8171.8330173	total: 3.64s	remaining: 2.97s
551:	learn: 8170.1620067	total: 3.66s	remaining: 2.97s
552:	learn: 8167.2346594	total: 3.68s	remaining: 2.98s
553:	learn: 8165.3418575	total: 3.7s	remaining: 2.98s
554:	learn: 8163.6531417	total: 3.72s	remaining: 2.98s
555:	learn: 8161.1252175	total: 3.73s	remaining: 2.98s
556:	learn: 8157.7236525	total: 3.75s	remaining: 2.98s
557:	learn: 8155.6566049	total: 3.77s	remaining: 2.99s
558:	learn: 8151.8225943	total: 3.79s	remaining: 2.99s
559:	learn: 8150.0965347	total: 3.81s	remaining: 3s
560:	learn: 8146.6467366	total: 3.83s	remaining: 3s
561:	learn: 8144.6022262	total: 3.85s	remaining: 3s
562:	learn: 8142.4997933	total: 3.88s	remaining: 3.01s
563:	learn: 8140.1443595	total: 3.9s	remaining: 3.01s
564:	learn: 8136.1737041	total: 3.92s	remaining: 3.01s
565:	learn: 8133.5352147	total: 3.94s	remaining: 3.02s
566:	learn: 8130.7408743	total: 3.95s	remaining: 3.02s
567:	learn: 8127.2908311	total: 3.98s	remaining: 3.02s
568:	learn: 8124.0016262	total: 4s	remaining: 3.03s
569:	learn: 8121.8315543	total: 4.03s	remaining: 3.04s
570:	learn: 8119.9813132	total: 4.05s	remaining: 3.04s
571:	learn: 8117.8212119	total: 4.07s	remaining: 3.04s
572:	learn: 8116.5642227	total: 4.08s	remaining: 3.04s
573:	learn: 8115.0638310	total: 4.1s	remaining: 3.04s
574:	learn: 8112.9192237	total: 4.12s	remaining: 3.05s
575:	learn: 8110.4085920	total: 4.14s	remaining: 3.04s
576:	learn: 8105.5420432	total: 4.16s	remaining: 3.05s
577:	learn: 8101.6655171	total: 4.18s	remaining: 3.05s
578:	learn: 8099.9111011	total: 4.19s	remaining: 3.05s
579:	learn: 8098.0997298	total: 4.22s	remaining: 3.06s
580:	learn: 8095.6052270	total: 4.23s	remaining: 3.05s
581:	learn: 8093.0037690	total: 4.25s	remaining: 3.05s
582:	learn: 8091.6077953	total: 4.26s	remaining: 3.05s
583:	learn: 8088.5287215	total: 4.29s	remaining: 3.05s
584:	learn: 8086.9425679	total: 4.3s	remaining: 3.05s
585:	learn: 8082.7434471	total: 4.32s	remaining: 3.05s
586:	learn: 8081.2032144	total: 4.34s	remaining: 3.05s
587:	learn: 8077.6782239	total: 4.36s	remaining: 3.06s
588:	learn: 8077.2424012	total: 4.38s	remaining: 3.06s
589:	learn: 8074.4079558	total: 4.4s	remaining: 3.06s
590:	learn: 8071.5971694	total: 4.42s	remaining: 3.06s
591:	learn: 8071.1504217	total: 4.45s	remaining: 3.06s
592:	learn: 8066.1872825	total: 4.46s	remaining: 3.06s
593:	learn: 8061.4718749	total: 4.48s	remaining: 3.06s
594:	learn: 8058.8919065	total: 4.5s	remaining: 3.06s
595:	learn: 8058.5398416	total: 4.52s	remaining: 3.06s
596:	learn: 8056.5288520	total: 4.54s	remaining: 3.06s
597:	learn: 8052.2682593	total: 4.55s	remaining: 3.06s
598:	learn: 8050.1366349	total: 4.58s	remaining: 3.06s
599:	learn: 8048.6721973	total: 4.59s	remaining: 3.06s
600:	learn: 8048.2967391	total: 4.61s	remaining: 3.06s
601:	learn: 8047.0745857	total: 4.63s	remaining: 3.06s
602:	learn: 8044.4378606	total: 4.64s	remaining: 3.06s
603:	learn: 8043.4739783	total: 4.66s	remaining: 3.06s
604:	learn: 8041.5200456	total: 4.67s	remaining: 3.05s
605:	learn: 8038.9253810	total: 4.69s	remaining: 3.05s
606:	learn: 8034.0123129	total: 4.71s	remaining: 3.05s
607:	learn: 8032.0460933	total: 4.73s	remaining: 3.05s
608:	learn: 8030.5235849	total: 4.75s	remaining: 3.05s
609:	learn: 8026.8524410	total: 4.76s	remaining: 3.05s
610:	learn: 8022.8220360	total: 4.79s	remaining: 3.05s
611:	learn: 8020.7281261	total: 4.8s	remaining: 3.04s
612:	learn: 8016.1526692	total: 4.83s	remaining: 3.05s
613:	learn: 8015.1594316	total: 4.84s	remaining: 3.04s
614:	learn: 8012.9478639	total: 4.86s	remaining: 3.04s
615:	learn: 8011.6170141	total: 4.88s	remaining: 3.04s
616:	learn: 8007.9423966	total: 4.9s	remaining: 3.04s
617:	learn: 8006.2788413	total: 4.92s	remaining: 3.04s
618:	learn: 8005.0854505	total: 4.93s	remaining: 3.04s
619:	learn: 8004.0232141	total: 4.96s	remaining: 3.04s
620:	learn: 8002.4203728	total: 4.98s	remaining: 3.04s
621:	learn: 8000.0972215	total: 5s	remaining: 3.04s
622:	learn: 7998.9038748	total: 5.02s	remaining: 3.04s
623:	learn: 7997.2755563	total: 5.04s	remaining: 3.04s
624:	learn: 7995.7992982	total: 5.05s	remaining: 3.03s
625:	learn: 7993.2704896	total: 5.08s	remaining: 3.03s
626:	learn: 7989.6257221	total: 5.1s	remaining: 3.03s
627:	learn: 7988.4132001	total: 5.11s	remaining: 3.03s
628:	learn: 7987.2379908	total: 5.13s	remaining: 3.03s
629:	learn: 7986.4345040	total: 5.15s	remaining: 3.02s
630:	learn: 7984.3074895	total: 5.17s	remaining: 3.02s
631:	learn: 7982.6698101	total: 5.19s	remaining: 3.02s
632:	learn: 7979.7748744	total: 5.21s	remaining: 3.02s
633:	learn: 7977.9922671	total: 5.23s	remaining: 3.02s
634:	learn: 7976.4749179	total: 5.25s	remaining: 3.02s
635:	learn: 7975.6434457	total: 5.26s	remaining: 3.01s
636:	learn: 7974.8379000	total: 5.29s	remaining: 3.01s
637:	learn: 7970.7545752	total: 5.3s	remaining: 3.01s
638:	learn: 7969.8112258	total: 5.33s	remaining: 3.01s
639:	learn: 7966.4200854	total: 5.35s	remaining: 3.01s
640:	learn: 7963.6823807	total: 5.37s	remaining: 3s
641:	learn: 7960.4002854	total: 5.38s	remaining: 3s
642:	learn: 7958.2999155	total: 5.4s	remaining: 3s
643:	learn: 7954.5832558	total: 5.42s	remaining: 3s
644:	learn: 7952.9553667	total: 5.44s	remaining: 3s
645:	learn: 7948.9130811	total: 5.46s	remaining: 2.99s
646:	learn: 7948.1830192	total: 5.48s	remaining: 2.99s
647:	learn: 7947.3368115	total: 5.5s	remaining: 2.99s
648:	learn: 7944.3952883	total: 5.52s	remaining: 2.98s
649:	learn: 7943.0077619	total: 5.53s	remaining: 2.98s
650:	learn: 7942.1876355	total: 5.55s	remaining: 2.97s
651:	learn: 7940.6311463	total: 5.57s	remaining: 2.97s
652:	learn: 7939.7468381	total: 5.59s	remaining: 2.97s
653:	learn: 7938.9565480	total: 5.6s	remaining: 2.96s
654:	learn: 7937.2087780	total: 5.62s	remaining: 2.96s
655:	learn: 7936.0470719	total: 5.63s	remaining: 2.95s
656:	learn: 7932.9537147	total: 5.66s	remaining: 2.95s
657:	learn: 7931.0602651	total: 5.68s	remaining: 2.95s
658:	learn: 7928.0141286	total: 5.69s	remaining: 2.94s
659:	learn: 7925.8068100	total: 5.71s	remaining: 2.94s
660:	learn: 7923.9037258	total: 5.73s	remaining: 2.94s
661:	learn: 7922.1685797	total: 5.75s	remaining: 2.94s
662:	learn: 7920.3820871	total: 5.77s	remaining: 2.93s
663:	learn: 7919.3204986	total: 5.79s	remaining: 2.93s
664:	learn: 7917.7203779	total: 5.8s	remaining: 2.92s
665:	learn: 7915.4313435	total: 5.82s	remaining: 2.92s
666:	learn: 7913.9162066	total: 5.84s	remaining: 2.92s
667:	learn: 7910.1301273	total: 5.87s	remaining: 2.92s
668:	learn: 7908.3365826	total: 5.88s	remaining: 2.91s
669:	learn: 7907.2882106	total: 5.91s	remaining: 2.91s
670:	learn: 7905.5670465	total: 5.92s	remaining: 2.9s
671:	learn: 7903.4921459	total: 5.94s	remaining: 2.9s
672:	learn: 7902.5305673	total: 5.96s	remaining: 2.9s
673:	learn: 7901.3558981	total: 5.98s	remaining: 2.89s
674:	learn: 7900.3678733	total: 5.99s	remaining: 2.88s
675:	learn: 7899.0166673	total: 6s	remaining: 2.88s
676:	learn: 7896.4851364	total: 6.01s	remaining: 2.87s
677:	learn: 7895.4091602	total: 6.01s	remaining: 2.86s
678:	learn: 7889.1726940	total: 6.02s	remaining: 2.85s
679:	learn: 7885.8443871	total: 6.03s	remaining: 2.83s
680:	learn: 7882.4190510	total: 6.03s	remaining: 2.83s
681:	learn: 7880.2491065	total: 6.04s	remaining: 2.81s
682:	learn: 7877.6091190	total: 6.04s	remaining: 2.81s
683:	learn: 7875.1566188	total: 6.05s	remaining: 2.79s
684:	learn: 7872.0834218	total: 6.06s	remaining: 2.78s
685:	learn: 7869.6322635	total: 6.07s	remaining: 2.78s
686:	learn: 7866.9237051	total: 6.07s	remaining: 2.77s
687:	learn: 7865.4004141	total: 6.08s	remaining: 2.76s
688:	learn: 7862.6406132	total: 6.08s	remaining: 2.75s
689:	learn: 7858.3678800	total: 6.09s	remaining: 2.74s
690:	learn: 7855.9473087	total: 6.09s	remaining: 2.73s
691:	learn: 7853.0412079	total: 6.1s	remaining: 2.71s
692:	learn: 7850.2517195	total: 6.11s	remaining: 2.71s
693:	learn: 7848.8555570	total: 6.11s	remaining: 2.7s
694:	learn: 7846.6014331	total: 6.12s	remaining: 2.69s
695:	learn: 7844.1695540	total: 6.13s	remaining: 2.67s
696:	learn: 7840.3360232	total: 6.13s	remaining: 2.67s
697:	learn: 7839.5485611	total: 6.14s	remaining: 2.65s
698:	learn: 7836.0860059	total: 6.14s	remaining: 2.65s
699:	learn: 7835.2522082	total: 6.15s	remaining: 2.63s
700:	learn: 7833.0328485	total: 6.16s	remaining: 2.63s
701:	learn: 7829.8063303	total: 6.16s	remaining: 2.62s
702:	learn: 7829.0151023	total: 6.17s	remaining: 2.6s
703:	learn: 7827.3898113	total: 6.17s	remaining: 2.6s
704:	learn: 7825.5646647	total: 6.18s	remaining: 2.58s
705:	learn: 7822.6993724	total: 6.18s	remaining: 2.58s
706:	learn: 7820.9748326	total: 6.19s	remaining: 2.56s
707:	learn: 7818.1467488	total: 6.2s	remaining: 2.56s
708:	learn: 7816.5959047	total: 6.2s	remaining: 2.54s
709:	learn: 7815.0758646	total: 6.21s	remaining: 2.54s
710:	learn: 7813.6591476	total: 6.21s	remaining: 2.52s
711:	learn: 7812.3568186	total: 6.22s	remaining: 2.51s
712:	learn: 7810.8520114	total: 6.22s	remaining: 2.5s
713:	learn: 7809.4807581	total: 6.23s	remaining: 2.5s
714:	learn: 7808.2171683	total: 6.23s	remaining: 2.48s
715:	learn: 7806.9605137	total: 6.24s	remaining: 2.47s
716:	learn: 7803.0804217	total: 6.25s	remaining: 2.46s
717:	learn: 7801.9313029	total: 6.25s	remaining: 2.46s
718:	learn: 7800.2373984	total: 6.26s	remaining: 2.44s
719:	learn: 7798.1312561	total: 6.27s	remaining: 2.44s
720:	learn: 7797.4580101	total: 6.27s	remaining: 2.43s
721:	learn: 7793.0412231	total: 6.28s	remaining: 2.42s
722:	learn: 7792.3098114	total: 6.28s	remaining: 2.41s
723:	learn: 7791.9609605	total: 6.29s	remaining: 2.4s
724:	learn: 7790.4818847	total: 6.29s	remaining: 2.39s
725:	learn: 7788.8182372	total: 6.3s	remaining: 2.38s
726:	learn: 7788.0879404	total: 6.31s	remaining: 2.37s
727:	learn: 7783.4553132	total: 6.31s	remaining: 2.36s
728:	learn: 7780.8654428	total: 6.32s	remaining: 2.35s
729:	learn: 7777.6768624	total: 6.33s	remaining: 2.34s
730:	learn: 7776.3917037	total: 6.33s	remaining: 2.33s
731:	learn: 7773.5954349	total: 6.34s	remaining: 2.32s
732:	learn: 7772.2993763	total: 6.34s	remaining: 2.31s
733:	learn: 7771.7070859	total: 6.35s	remaining: 2.3s
734:	learn: 7770.7876113	total: 6.35s	remaining: 2.29s
735:	learn: 7769.5092994	total: 6.36s	remaining: 2.28s
736:	learn: 7768.3151920	total: 6.37s	remaining: 2.27s
737:	learn: 7766.8924377	total: 6.37s	remaining: 2.26s
738:	learn: 7766.2209793	total: 6.38s	remaining: 2.25s
739:	learn: 7764.3050607	total: 6.38s	remaining: 2.24s
740:	learn: 7763.1428117	total: 6.39s	remaining: 2.23s
741:	learn: 7762.0979493	total: 6.39s	remaining: 2.22s
742:	learn: 7759.5675864	total: 6.4s	remaining: 2.21s
743:	learn: 7757.3935912	total: 6.4s	remaining: 2.2s
744:	learn: 7755.8562132	total: 6.41s	remaining: 2.19s
745:	learn: 7750.5390659	total: 6.42s	remaining: 2.18s
746:	learn: 7747.9187625	total: 6.42s	remaining: 2.17s
747:	learn: 7746.3071676	total: 6.44s	remaining: 2.17s
748:	learn: 7744.9504681	total: 6.45s	remaining: 2.16s
749:	learn: 7741.2546318	total: 6.46s	remaining: 2.15s
750:	learn: 7739.2673410	total: 6.47s	remaining: 2.15s
751:	learn: 7735.3490246	total: 6.48s	remaining: 2.14s
752:	learn: 7734.6744940	total: 6.48s	remaining: 2.13s
753:	learn: 7733.1291473	total: 6.49s	remaining: 2.12s
754:	learn: 7730.0353343	total: 6.49s	remaining: 2.11s
755:	learn: 7728.6957025	total: 6.5s	remaining: 2.1s
756:	learn: 7726.1853493	total: 6.5s	remaining: 2.09s
757:	learn: 7724.8311325	total: 6.51s	remaining: 2.08s
758:	learn: 7722.5348178	total: 6.52s	remaining: 2.07s
759:	learn: 7720.1675751	total: 6.52s	remaining: 2.06s
760:	learn: 7718.6754555	total: 6.53s	remaining: 2.05s
761:	learn: 7715.7460572	total: 6.54s	remaining: 2.04s
762:	learn: 7713.9796244	total: 6.54s	remaining: 2.03s
763:	learn: 7710.5782215	total: 6.55s	remaining: 2.02s
764:	learn: 7707.8839040	total: 6.55s	remaining: 2.01s
765:	learn: 7707.1610937	total: 6.56s	remaining: 2s
766:	learn: 7705.9142052	total: 6.57s	remaining: 1.99s
767:	learn: 7704.8341105	total: 6.57s	remaining: 1.99s
768:	learn: 7703.6573617	total: 6.58s	remaining: 1.98s
769:	learn: 7702.7964273	total: 6.58s	remaining: 1.97s
770:	learn: 7700.0839730	total: 6.59s	remaining: 1.96s
771:	learn: 7698.1544813	total: 6.59s	remaining: 1.95s
772:	learn: 7697.1857398	total: 6.6s	remaining: 1.94s
773:	learn: 7694.9587971	total: 6.61s	remaining: 1.93s
774:	learn: 7693.8346077	total: 6.61s	remaining: 1.92s
775:	learn: 7691.4596718	total: 6.62s	remaining: 1.91s
776:	learn: 7689.2739898	total: 6.62s	remaining: 1.9s
[... verbose training log for iterations 777–998 omitted; the training loss decreases steadily ...]
999:	learn: 7299.6286663	total: 8.01s	remaining: 0us
Out[262]:
<catboost.core.CatBoostRegressor at 0x7f9be4c7a790>

Predictions.¶

In [263]:
p_HGBR = HGBR2.predict(X_test)
p_CBR  = CBR2.predict(X_test)
In [264]:
tabla_resumen = X_test.copy()
tabla_resumen['respuestas'] = y_test
In [265]:
#Add the models' predictions to our summary table
tabla_resumen['p_HGBR'] = p_HGBR
tabla_resumen['p_CBR'] = p_CBR
In [266]:
tabla_resumen
Out[266]:
AGLOMERADO Edad NIVEL_ED_2 NIVEL_ED ESTADO CAT_OCUP Cant_Ocup Horas_sem INTENSI Tipo_empr ... CAT_ECON_P CAT_ECON_Q CAT_ECON_R CAT_ECON_S CAT_ECON_T CAT_ECON_U CAT_ECON_W respuestas p_HGBR p_CBR
13517 20 16 4 3 0 0 0 0.0 0 0.0 ... 0 0 0 0 0 0 1 0 52.617634 -195.325696
36330 29 32 7 5 1 2 1 25.0 2 3.0 ... 0 0 0 0 0 0 0 50000 46547.600294 42080.156780
48601 8 39 4 4 1 2 1 30.0 2 3.0 ... 0 0 0 0 0 0 0 54000 45631.026835 44817.282498
29269 6 3 1 0 0 0 0 0.0 0 0.0 ... 0 0 0 0 0 0 1 0 52.617634 -72.917524
15882 33 50 4 3 1 1 1 40.0 2 2.0 ... 0 0 0 0 0 0 0 60000 46862.525697 51288.573215
... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ...
38321 15 33 2 2 1 2 2 24.0 2 2.0 ... 0 0 0 0 0 0 0 30000 21483.130584 23915.275458
39014 5 68 4 4 0 0 0 0.0 0 0.0 ... 0 0 0 0 0 0 1 0 52.617634 121.709987
10365 3 1 1 0 0 0 0 0.0 0 0.0 ... 0 0 0 0 0 0 1 0 52.617634 -46.777807
542 33 73 4 4 0 0 0 0.0 0 0.0 ... 0 0 0 0 0 0 1 0 52.617634 -139.118760
48011 34 16 4 3 0 0 0 0.0 0 0.0 ... 0 0 0 0 0 0 1 0 52.617634 21.815456

11792 rows × 54 columns

Metric computation.

1) Mean squared error

In [267]:
print('MSE of HGBR')
print(mean_squared_error(y_test,p_HGBR))
print('MSE of CBR')
print(mean_squared_error(y_test,p_CBR))
MSE of HGBR
100275844.15326901
MSE of CBR
97134507.35625592

2) R2

In [268]:
print('R2 of HGBR')
print(r2_score(y_test,p_HGBR))
print('R2 of CBR')
print(r2_score(y_test,p_CBR))
R2 of HGBR
0.8426245916636309
R2 of CBR
0.8475546839039552

3) MAE

In [269]:
print('MAE of HGBR')
print(mean_absolute_error(y_test,p_HGBR))
print('MAE of CBR')
print(mean_absolute_error(y_test,p_CBR))
MAE of HGBR
3771.316125268828
MAE of CBR
3722.7450014878173
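As a sanity check, the three metrics reported above can be reproduced directly from their definitions. A minimal sketch on made-up toy values (not the notebook's data):

```python
import numpy as np
from sklearn.metrics import mean_squared_error, mean_absolute_error, r2_score

# Hypothetical toy vectors, only to illustrate the formulas
y_true = np.array([0.0, 50000.0, 30000.0, 60000.0])
y_pred = np.array([100.0, 46000.0, 28000.0, 55000.0])

mse = np.mean((y_true - y_pred) ** 2)           # mean squared error
mae = np.mean(np.abs(y_true - y_pred))          # mean absolute error
ss_res = np.sum((y_true - y_pred) ** 2)         # residual sum of squares
ss_tot = np.sum((y_true - y_true.mean()) ** 2)  # total sum of squares
r2 = 1.0 - ss_res / ss_tot                      # coefficient of determination

assert np.isclose(mse, mean_squared_error(y_true, y_pred))
assert np.isclose(mae, mean_absolute_error(y_true, y_pred))
assert np.isclose(r2, r2_score(y_true, y_pred))
```

Note that MSE penalizes large errors quadratically, which is why its magnitude here dwarfs the MAE.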
In [270]:
fig = px.scatter(tabla_resumen, 
                 x= tabla_resumen.index,
                 y= ['respuestas','p_HGBR','p_CBR'],
                 trendline="lowess", 
                 trendline_options=dict(frac=0.03),
                 title="Actual data vs Predictions")
fig.data = [t for t in fig.data if t.mode == "lines"]
fig.update_traces(showlegend=True) #trendlines have showlegend=False by default
fig.show()

We replace the negative predictions:

In [271]:
#Replace the negative predictions of both models with 0
p_HGBR = np.where(p_HGBR < 0, 0, p_HGBR)
p_CBR = np.where(p_CBR < 0, 0, p_CBR)
In [272]:
#Replace the negative predictions with 0 in the summary table as well
tabla_resumen['p_HGBR'] = np.where((tabla_resumen['p_HGBR'] <0 ), 0, tabla_resumen['p_HGBR'])
tabla_resumen['p_CBR'] = np.where((tabla_resumen['p_CBR'] <0 ), 0, tabla_resumen['p_CBR'])
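An equivalent and arguably clearer way to express this clipping is `np.clip` with a lower bound of 0 and no upper bound. A quick sketch on made-up values:

```python
import numpy as np

# Hypothetical predictions, some negative
preds = np.array([-195.3, 52.6, 46547.6, -72.9])

clipped_where = np.where(preds < 0, 0, preds)  # the approach used above
clipped_clip = np.clip(preds, 0, None)         # equivalent: floor at 0, no ceiling

assert np.array_equal(clipped_where, clipped_clip)
```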
In [273]:
tabla_resumen['p_HGBR'].value_counts()
Out[273]:
52.617634       8062
62.597800        484
0.000000          15
20200.265974       2
11258.851758       2
                ... 
36985.241867       1
15995.463357       1
57252.430351       1
27390.376160       1
21483.130584       1
Name: p_HGBR, Length: 3208, dtype: int64
In [274]:
tabla_resumen['p_CBR'].value_counts()
Out[274]:
0.000000        4299
22.304521         76
41.016162         58
97.193368         57
21.524859         42
                ... 
19557.705456       1
86.215854          1
34799.018828       1
25636.296331       1
121.709987         1
Name: p_CBR, Length: 5042, dtype: int64
In [275]:
tabla_resumen['respuestas'].value_counts()
Out[275]:
0         8625
30000      247
60000      242
40000      233
50000      212
          ... 
10800        1
8100         1
800          1
38500        1
125000       1
Name: respuestas, Length: 188, dtype: int64

We plot again, now without the negative values:

In [276]:
fig = px.scatter(tabla_resumen, 
                 x= tabla_resumen.index,
                 y= ['respuestas','p_HGBR','p_CBR'],
                 trendline="lowess", 
                 trendline_options=dict(frac=0.03),
                 title="Actual data vs Predictions")
fig.data = [t for t in fig.data if t.mode == "lines"]
fig.update_traces(showlegend=True)
fig.show()

The plot shows that both models perform similarly. However, when we inspect the frequency of the predictions, HistGradientBoostingRegressor outputs the exact same value for thousands of rows (8,062 predictions of 52.62), so its predictions are far coarser than CatBoost's.

For this reason, CatBoost will be selected as the winning model.

Feature importances¶

We will determine the variable importances of the CatBoost model, which performed best.

In [277]:
CBR2.feature_importances_
Out[277]:
array([6.54938533e+00, 7.42758336e+00, 1.48349023e+00, 2.68870251e+00,
       8.10468383e-01, 6.43143869e-01, 1.21063874e+01, 1.33957994e+01,
       2.94682077e+00, 8.86148424e-01, 2.39797624e+00, 8.07136810e+00,
       4.19414628e+00, 3.61474570e+00, 2.18719798e+00, 1.39433638e+00,
       8.69135060e-01, 5.02143972e+00, 3.99967853e+00, 1.34605926e+00,
       3.72670041e-01, 5.15919226e-01, 6.10233110e-01, 2.11748861e+00,
       8.02308251e-01, 5.20754134e+00, 4.36818213e-01, 2.00889981e-01,
       8.08156822e-01, 9.32136599e-01, 2.99831303e-03, 1.06131474e-01,
       1.42617634e-01, 2.77687992e-01, 2.15282015e-01, 4.40418390e-01,
       1.20098830e+00, 2.44662259e-01, 4.58068023e-01, 6.81347460e-01,
       1.80493908e-01, 1.40706382e-01, 4.55118785e-01, 9.99016569e-01,
       1.82511454e-01, 1.32244943e-01, 9.87919824e-02, 2.73898730e-03,
       0.00000000e+00, 0.00000000e+00, 0.00000000e+00])
In [278]:
X_train_temp.columns
Out[278]:
Index(['AGLOMERADO', 'Edad', 'NIVEL_ED_2', 'NIVEL_ED', 'ESTADO', 'CAT_OCUP',
       'Cant_Ocup', 'Horas_sem', 'INTENSI', 'Tipo_empr', 'Cod_activ',
       'Tamaño_empr', 'Cod_Ocup', 'Lugar_trab', 'Tamaño_empr_2',
       'CARACTER_OCUP', 'JERARQUIA_OCUP', 'TECNOLOGIA_OCUP',
       'CALIFICACION_OCUP', 'ACTIV_ECON', 'REGION_CUYO', 'REGION_GBA',
       'REGION_NEA', 'REGION_NOA', 'REGION_PAMPEANA', 'REGION_PATAGONIA',
       'MAS_500_N', 'MAS_500_S', 'Sexo_F', 'Sexo_M', 'CAT_ECON_A',
       'CAT_ECON_C', 'CAT_ECON_D', 'CAT_ECON_E', 'CAT_ECON_F', 'CAT_ECON_G',
       'CAT_ECON_H', 'CAT_ECON_I', 'CAT_ECON_J', 'CAT_ECON_K', 'CAT_ECON_L',
       'CAT_ECON_M', 'CAT_ECON_N', 'CAT_ECON_O', 'CAT_ECON_P', 'CAT_ECON_Q',
       'CAT_ECON_R', 'CAT_ECON_S', 'CAT_ECON_T', 'CAT_ECON_U', 'CAT_ECON_W'],
      dtype='object')
In [279]:
feature_imp = pd.Series(CBR2.feature_importances_,index=X_train_temp.columns).sort_values(ascending=False)
feature_imp
Out[279]:
Horas_sem            13.395799
Cant_Ocup            12.106387
Tamaño_empr           8.071368
Edad                  7.427583
AGLOMERADO            6.549385
REGION_PATAGONIA      5.207541
TECNOLOGIA_OCUP       5.021440
Cod_Ocup              4.194146
CALIFICACION_OCUP     3.999679
Lugar_trab            3.614746
INTENSI               2.946821
NIVEL_ED              2.688703
Cod_activ             2.397976
Tamaño_empr_2         2.187198
REGION_NOA            2.117489
NIVEL_ED_2            1.483490
CARACTER_OCUP         1.394336
ACTIV_ECON            1.346059
CAT_ECON_H            1.200988
CAT_ECON_O            0.999017
Sexo_M                0.932137
Tipo_empr             0.886148
JERARQUIA_OCUP        0.869135
ESTADO                0.810468
Sexo_F                0.808157
REGION_PAMPEANA       0.802308
CAT_ECON_K            0.681347
CAT_OCUP              0.643144
REGION_NEA            0.610233
REGION_GBA            0.515919
CAT_ECON_J            0.458068
CAT_ECON_N            0.455119
CAT_ECON_G            0.440418
MAS_500_N             0.436818
REGION_CUYO           0.372670
CAT_ECON_E            0.277688
CAT_ECON_I            0.244662
CAT_ECON_F            0.215282
MAS_500_S             0.200890
CAT_ECON_P            0.182511
CAT_ECON_L            0.180494
CAT_ECON_D            0.142618
CAT_ECON_M            0.140706
CAT_ECON_Q            0.132245
CAT_ECON_C            0.106131
CAT_ECON_R            0.098792
CAT_ECON_A            0.002998
CAT_ECON_S            0.002739
CAT_ECON_T            0.000000
CAT_ECON_U            0.000000
CAT_ECON_W            0.000000
dtype: float64
In [280]:
# Horizontal bar chart of the feature importances
plt.figure(figsize=(15,15))
sns.barplot(x=feature_imp, y=feature_imp.index)
# Add labels to the chart
plt.xlabel('Feature Importance Score')
plt.ylabel('Features')
plt.title("Visualizing Important Features")
plt.show()

The chart shows that roughly half of the variables have little relevance; most of them are the one-hot-encoded columns derived from "CAT_ECON" (economic category).
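When a categorical variable has been one-hot encoded, comparing it against the other features on equal footing usually means summing the importances of its dummy columns back into the parent variable. A sketch with hypothetical importance values (the `_[A-Z]$` suffix pattern is an assumption about the OHE naming used here):

```python
import pandas as pd

# Hypothetical importances, for illustration only
imp = pd.Series({'Horas_sem': 13.4, 'CAT_ECON_H': 1.2,
                 'CAT_ECON_O': 1.0, 'CAT_ECON_A': 0.003})

# Strip the one-letter OHE suffix so dummies share their parent's name
parent = imp.index.to_series().str.replace(r'_[A-Z]$', '', regex=True)

# Sum dummy importances back into the parent categorical
grouped = imp.groupby(parent.values).sum().sort_values(ascending=False)
# CAT_ECON's dummies now collapse into one aggregate importance
```

Even aggregated this way, the CAT_ECON dummies above would sum to only about 2.2, still far below Horas_sem, which supports dropping or natively encoding the variable.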

Selection of the winning model: CatBoost Regressor¶

We will retrain the CatBoost model, this time using the Pool object that CatBoost provides, which removes the need to one-hot encode the categorical (i.e. non-numeric) variables. We will use cross-validation to verify that the metric values do not drop.

Copy of the dataset without OHE¶

In [281]:
df_2= df_1c.copy()
In [282]:
df_2.info()
<class 'pandas.core.frame.DataFrame'>
Int64Index: 46241 entries, 0 to 49705
Data columns (total 25 columns):
 #   Column             Non-Null Count  Dtype  
---  ------             --------------  -----  
 0   REGION             46241 non-null  object 
 1   AGLOMERADO         46241 non-null  int64  
 2   MAS_500            46241 non-null  object 
 3   Sexo               46241 non-null  object 
 4   Edad               46241 non-null  int64  
 5   NIVEL_ED_2         46241 non-null  int64  
 6   NIVEL_ED           46241 non-null  int64  
 7   ESTADO             46241 non-null  int64  
 8   CAT_OCUP           46241 non-null  int64  
 9   Cant_Ocup          46241 non-null  int64  
 10  Horas_sem          46241 non-null  float64
 11  INTENSI            46241 non-null  int64  
 12  Tipo_empr          46241 non-null  float64
 13  Cod_activ          46241 non-null  int64  
 14  Tamaño_empr        46241 non-null  int64  
 15  Cod_Ocup           46241 non-null  int64  
 16  Lugar_trab         46241 non-null  int64  
 17  Ingresos           46241 non-null  int64  
 18  Tamaño_empr_2      46241 non-null  int64  
 19  CARACTER_OCUP      46241 non-null  object 
 20  JERARQUIA_OCUP     46241 non-null  int64  
 21  TECNOLOGIA_OCUP    46241 non-null  int64  
 22  CALIFICACION_OCUP  46241 non-null  int64  
 23  ACTIV_ECON         46241 non-null  object 
 24  CAT_ECON           46241 non-null  object 
dtypes: float64(2), int64(17), object(6)
memory usage: 9.2+ MB

The CAT_ECON column could be dropped, since all of its OHE-derived variables showed very low importance; here we leave the drop commented out and keep the column, as CatBoost can consume it directly as a categorical feature.

In [283]:
#The CAT_ECON drop (very low importance) is left commented out; the column is kept
#df_2 = df_2.drop(['CAT_ECON'], axis=1)

Outlier removal¶

We will again apply the Isolation Forest technique. First, though, we need to temporarily convert the text columns to numeric.

In [284]:
df_2[['CARACTER_OCUP','ACTIV_ECON']]=df_2[['CARACTER_OCUP','ACTIV_ECON']].astype(int)
In [285]:
df_2['REGION'].value_counts()
Out[285]:
PAMPEANA     13550
NOA          10725
PATAGONIA     6211
GBA           5747
CUYO          5156
NEA           4852
Name: REGION, dtype: int64
In [286]:
#Map the values of the REGION column to numbers
df_2['REGION']=df_2['REGION'].replace({'GBA':5, 'NOA':1, 'NEA':2,
                                          'CUYO':3,'PAMPEANA':4,'PATAGONIA':6})
In [287]:
df_2['MAS_500'].value_counts()
Out[287]:
N    25411
S    20830
Name: MAS_500, dtype: int64
In [288]:
#Map the values of the MAS_500 column to numbers
df_2['MAS_500']=df_2['MAS_500'].replace({'N':0, 'S':1})
In [289]:
df_2['Sexo'].value_counts()
Out[289]:
F    24440
M    21801
Name: Sexo, dtype: int64
In [290]:
#Map the values of the Sexo column to numbers
df_2['Sexo']=df_2['Sexo'].replace({'F':1, 'M':2})
In [291]:
df_2['CAT_ECON'].value_counts()
Out[291]:
W    28639
G     3284
O     2298
C     1898
F     1792
P     1541
T     1274
Q     1145
H      780
S      745
N      686
M      526
I      513
R      329
J      289
K      245
E      107
L       76
D       65
A        8
U        1
Name: CAT_ECON, dtype: int64
In [292]:
#Map the values of the CAT_ECON column to numbers
df_2['CAT_ECON']=df_2['CAT_ECON'].replace({'W':21, 'G':20,'O':19,'C':18,'F':17,'P':16,'T':15,'Q':14,
                                           'H':13, 'S':12,'N':11,'M':10,'I':9,'R':8,'J':7,'K':6,
                                           'E':5, 'L':4,'D':3,'A':2,'U':1
                                           })
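The manual replace maps above work, but a generic alternative is `pd.factorize`, which assigns integer codes in order of first appearance (note it would not reproduce the specific codes chosen above). A minimal sketch on toy REGION values:

```python
import pandas as pd

# Toy REGION values (hypothetical sample)
s = pd.Series(['NOA', 'GBA', 'NOA', 'CUYO'])

# factorize returns integer codes plus the unique labels, in order of appearance
codes, uniques = pd.factorize(s)
# NOA -> 0, GBA -> 1, CUYO -> 2
```

This scales to any number of categories without maintaining a dictionary by hand, at the cost of control over which code each category gets.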
In [293]:
df_2.info()
<class 'pandas.core.frame.DataFrame'>
Int64Index: 46241 entries, 0 to 49705
Data columns (total 25 columns):
 #   Column             Non-Null Count  Dtype  
---  ------             --------------  -----  
 0   REGION             46241 non-null  int64  
 1   AGLOMERADO         46241 non-null  int64  
 2   MAS_500            46241 non-null  int64  
 3   Sexo               46241 non-null  int64  
 4   Edad               46241 non-null  int64  
 5   NIVEL_ED_2         46241 non-null  int64  
 6   NIVEL_ED           46241 non-null  int64  
 7   ESTADO             46241 non-null  int64  
 8   CAT_OCUP           46241 non-null  int64  
 9   Cant_Ocup          46241 non-null  int64  
 10  Horas_sem          46241 non-null  float64
 11  INTENSI            46241 non-null  int64  
 12  Tipo_empr          46241 non-null  float64
 13  Cod_activ          46241 non-null  int64  
 14  Tamaño_empr        46241 non-null  int64  
 15  Cod_Ocup           46241 non-null  int64  
 16  Lugar_trab         46241 non-null  int64  
 17  Ingresos           46241 non-null  int64  
 18  Tamaño_empr_2      46241 non-null  int64  
 19  CARACTER_OCUP      46241 non-null  int64  
 20  JERARQUIA_OCUP     46241 non-null  int64  
 21  TECNOLOGIA_OCUP    46241 non-null  int64  
 22  CALIFICACION_OCUP  46241 non-null  int64  
 23  ACTIV_ECON         46241 non-null  int64  
 24  CAT_ECON           46241 non-null  int64  
dtypes: float64(2), int64(23)
memory usage: 9.2 MB

Now that all columns are numeric, we can apply Isolation Forest.

In [294]:
isolation_forest = IsolationForest(contamination=0.15)
In [295]:
isolation_forest.fit(df_2)
Out[295]:
IsolationForest(contamination=0.15)
In [296]:
y_outlier = isolation_forest.predict(df_2)
In [297]:
df_2['is_outlier'] = y_outlier
In [298]:
df_2.sample(10)
Out[298]:
REGION AGLOMERADO MAS_500 Sexo Edad NIVEL_ED_2 NIVEL_ED ESTADO CAT_OCUP Cant_Ocup ... Lugar_trab Ingresos Tamaño_empr_2 CARACTER_OCUP JERARQUIA_OCUP TECNOLOGIA_OCUP CALIFICACION_OCUP ACTIV_ECON CAT_ECON is_outlier
10893 1 19 0 1 34 6 6 1 2 1 ... 1 30000 3 41 2 3 3 85 16 1
12394 6 91 0 1 24 4 3 0 2 0 ... 0 0 0 0 0 0 0 0 21 1
44738 4 3 0 1 69 2 2 0 0 0 ... 0 0 0 0 0 0 0 0 21 1
46852 4 30 0 2 3 1 0 0 0 0 ... 0 0 0 0 0 0 0 0 21 1
2496 1 23 1 2 13 4 3 0 0 0 ... 0 0 0 0 0 0 0 0 21 1
18159 4 4 1 2 85 7 5 0 0 0 ... 0 0 0 0 0 0 0 0 21 1
27107 2 15 0 1 45 2 2 1 1 1 ... 6 12000 1 30 1 1 2 48 20 -1
41251 6 17 0 2 52 6 6 1 2 1 ... 1 98000 2 41 2 1 3 85 16 1
43889 3 27 1 2 19 4 3 0 0 0 ... 0 0 0 0 0 0 0 0 21 1
3418 5 33 1 1 9 2 1 0 0 0 ... 0 0 0 0 0 0 0 0 21 1

10 rows × 26 columns

In [299]:
df_2['is_outlier'].value_counts()
Out[299]:
 1    39305
-1     6936
Name: is_outlier, dtype: int64
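The split above is consistent with the `contamination=0.15` setting: the number of rows flagged as outliers (-1) is about 15% of the total. A quick check using the figures from the output above:

```python
# Counts taken from the value_counts output above
n_inliers, n_outliers = 39305, 6936

frac = n_outliers / (n_inliers + n_outliers)
# contamination=0.15 asks Isolation Forest to flag ~15% of rows
assert abs(frac - 0.15) < 0.001
```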
In [300]:
outliers=  df_2[df_2.is_outlier == -1]
In [301]:
outliers
Out[301]:
REGION AGLOMERADO MAS_500 Sexo Edad NIVEL_ED_2 NIVEL_ED ESTADO CAT_OCUP Cant_Ocup ... Lugar_trab Ingresos Tamaño_empr_2 CARACTER_OCUP JERARQUIA_OCUP TECNOLOGIA_OCUP CALIFICACION_OCUP ACTIV_ECON CAT_ECON is_outlier
0 4 14 0 1 42 6 6 1 2 1 ... 1 150000 2 48 2 1 4 84 19 -1
12 6 31 0 2 56 4 4 1 2 1 ... 1 120000 3 3 4 0 4 84 19 -1
14 6 31 0 2 26 6 6 1 2 0 ... 6 0 2 80 2 3 3 73 10 -1
16 6 31 0 2 65 4 4 1 2 1 ... 6 154000 3 42 2 3 4 72 10 -1
27 4 13 1 1 45 4 3 1 1 1 ... 9 4000 1 33 1 1 1 56 9 -1
... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ...
49683 4 3 0 1 25 2 2 1 2 1 ... 1 20000 2 80 2 1 2 23 18 -1
49693 5 32 1 1 42 8 6 1 1 1 ... 10 100000 1 30 1 3 3 79 11 -1
49695 5 32 1 2 36 7 6 1 2 2 ... 6 98000 2 42 2 3 3 73 10 -1
49704 4 2 1 1 34 6 6 1 2 1 ... 1 50000 4 10 2 3 2 84 19 -1
49705 4 2 1 1 64 2 2 1 2 1 ... 8 30000 1 56 2 1 1 0 21 -1

6936 rows × 26 columns

We will remove the outliers flagged by Isolation Forest.

In [302]:
df_sin_outliers2 = df_2[df_2.is_outlier == 1]

Training CatBoost using the Pool object and CV¶

In [303]:
#Training data (outliers removed)
df_2 = df_sin_outliers2.drop(['is_outlier'], axis=1)

First we define the independent variables "X" and the target variable "y".

In [304]:
X2 = df_2.drop(['Ingresos'], axis=1)
y2 = df_2[['Ingresos']]
In [305]:
X2.columns
Out[305]:
Index(['REGION', 'AGLOMERADO', 'MAS_500', 'Sexo', 'Edad', 'NIVEL_ED_2',
       'NIVEL_ED', 'ESTADO', 'CAT_OCUP', 'Cant_Ocup', 'Horas_sem', 'INTENSI',
       'Tipo_empr', 'Cod_activ', 'Tamaño_empr', 'Cod_Ocup', 'Lugar_trab',
       'Tamaño_empr_2', 'CARACTER_OCUP', 'JERARQUIA_OCUP', 'TECNOLOGIA_OCUP',
       'CALIFICACION_OCUP', 'ACTIV_ECON', 'CAT_ECON'],
      dtype='object')

We convert all the categorical variables to string type.

In [306]:
cat_cols = ['REGION', 'AGLOMERADO', 'MAS_500', 'Sexo', 'NIVEL_ED_2',
            'NIVEL_ED', 'ESTADO', 'CAT_OCUP', 'INTENSI',
            'Tipo_empr', 'Cod_activ', 'Tamaño_empr', 'Cod_Ocup', 'Lugar_trab',
            'Tamaño_empr_2', 'CARACTER_OCUP', 'JERARQUIA_OCUP', 'TECNOLOGIA_OCUP',
            'CALIFICACION_OCUP', 'ACTIV_ECON', 'CAT_ECON']
X2[cat_cols] = X2[cat_cols].astype(str)
In [307]:
X2.info()
<class 'pandas.core.frame.DataFrame'>
Int64Index: 39305 entries, 1 to 49703
Data columns (total 24 columns):
 #   Column             Non-Null Count  Dtype  
---  ------             --------------  -----  
 0   REGION             39305 non-null  object 
 1   AGLOMERADO         39305 non-null  object 
 2   MAS_500            39305 non-null  object 
 3   Sexo               39305 non-null  object 
 4   Edad               39305 non-null  int64  
 5   NIVEL_ED_2         39305 non-null  object 
 6   NIVEL_ED           39305 non-null  object 
 7   ESTADO             39305 non-null  object 
 8   CAT_OCUP           39305 non-null  object 
 9   Cant_Ocup          39305 non-null  int64  
 10  Horas_sem          39305 non-null  float64
 11  INTENSI            39305 non-null  object 
 12  Tipo_empr          39305 non-null  object 
 13  Cod_activ          39305 non-null  object 
 14  Tamaño_empr        39305 non-null  object 
 15  Cod_Ocup           39305 non-null  object 
 16  Lugar_trab         39305 non-null  object 
 17  Tamaño_empr_2      39305 non-null  object 
 18  CARACTER_OCUP      39305 non-null  object 
 19  JERARQUIA_OCUP     39305 non-null  object 
 20  TECNOLOGIA_OCUP    39305 non-null  object 
 21  CALIFICACION_OCUP  39305 non-null  object 
 22  ACTIV_ECON         39305 non-null  object 
 23  CAT_ECON           39305 non-null  object 
dtypes: float64(1), int64(2), object(21)
memory usage: 7.5+ MB

We can see that only Edad (age), Cant_Ocup (number of jobs) and Horas_sem (weekly hours worked) remain as numeric variables.
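A programmatic way to confirm which columns remain numeric is `select_dtypes`. A sketch on a toy frame mirroring the dtypes above:

```python
import pandas as pd

# Toy frame with the same dtype mix as X2: two numeric columns, one object column
toy = pd.DataFrame({'Edad': [30], 'Horas_sem': [40.0], 'REGION': ['4']})

# Keep only columns with a numeric dtype
num_cols = toy.select_dtypes(include='number').columns.tolist()
assert num_cols == ['Edad', 'Horas_sem']
```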

In [308]:
#List of categorical columns
categoricalcolumns = ['REGION', 'AGLOMERADO', 'MAS_500', 'Sexo', 'NIVEL_ED_2',
       'NIVEL_ED', 'ESTADO', 'CAT_OCUP', 'INTENSI',
       'Tipo_empr', 'Cod_activ', 'Tamaño_empr', 'Cod_Ocup', 'Lugar_trab',
       'Tamaño_empr_2', 'CARACTER_OCUP', 'JERARQUIA_OCUP', 'TECNOLOGIA_OCUP',
       'CALIFICACION_OCUP', 'ACTIV_ECON','CAT_ECON']
print("Names of categorical columns : ", categoricalcolumns)
#Get the positions of the categorical columns
cat_features = [X2.columns.get_loc(col) for col in categoricalcolumns]
print("Location of categorical columns : ",cat_features)
Names of categorical columns :  ['REGION', 'AGLOMERADO', 'MAS_500', 'Sexo', 'NIVEL_ED_2', 'NIVEL_ED', 'ESTADO', 'CAT_OCUP', 'INTENSI', 'Tipo_empr', 'Cod_activ', 'Tamaño_empr', 'Cod_Ocup', 'Lugar_trab', 'Tamaño_empr_2', 'CARACTER_OCUP', 'JERARQUIA_OCUP', 'TECNOLOGIA_OCUP', 'CALIFICACION_OCUP', 'ACTIV_ECON', 'CAT_ECON']
Location of categorical columns :  [0, 1, 2, 3, 5, 6, 7, 8, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23]

Pool Object

CatBoost's Pool object bundles the independent and dependent variables (X and y) together with the list of categorical features. We will pass the Pool object to the fit() method as the training data. There is no need to apply one-hot encoding: by the time the model is built, the pool object already carries the categorical information. We create the pool object with the following code.

In [309]:
# Import Pool
from catboost import Pool

#Create the pool object for the training dataset, passing the categorical
#column information through the cat_features parameter
train_data = Pool(data=X2,
                  label=y2,
                  cat_features=cat_features
                 )
In [310]:
# Build the model, using MAE as the loss function
# (passing verbose=False here would silence the long per-iteration log below)
cat_model = CatBoostRegressor(loss_function='MAE')

# Put it in a list that we will reuse for cross-validation
modelos = [cat_model]
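The cv_comparison function was defined earlier, in the "Cross-Validation" section. Its core loop (split into K folds, fit on K−1 of them, measure MAE on the held-out fold) can be sketched with NumPy alone, using a naive mean predictor as a stand-in for the CatBoost model (illustrative only):

```python
import numpy as np

def kfold_mae(y, k=6, seed=0):
    """Manual K-fold CV: return the held-out MAE of a naive mean predictor per fold."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))          # shuffle row indices
    folds = np.array_split(idx, k)         # k roughly equal folds
    maes = []
    for i in range(k):
        test_idx = folds[i]
        train_idx = np.concatenate([folds[j] for j in range(k) if j != i])
        # stand-in "model": predict the training-fold mean salary
        pred = y[train_idx].mean()
        maes.append(np.abs(y[test_idx] - pred).mean())
    return maes

y = np.array([10.0, 12.0, 9.0, 11.0, 30.0, 10.5])
maes = kfold_mae(y, k=3)
print(len(maes))  # 3, one MAE per fold
```

A real model would replace the mean predictor with a fit/predict pair on each training fold; the fold bookkeeping is the same.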
In [311]:
# Run the function defined in the "Cross-Validation" section
cv_metricas, maes, mses, r2s = cv_comparison(modelos, X2, y2, 6)
The last 5,000 lines of the output stream were truncated.
0:	learn: 11489.0788066	total: 13.6ms	remaining: 13.6s
1:	learn: 11175.5424363	total: 28.2ms	remaining: 14.1s
2:	learn: 10873.2868688	total: 40.7ms	remaining: 13.5s
3:	learn: 10585.1409300	total: 51.7ms	remaining: 12.9s
4:	learn: 10303.9958682	total: 63.9ms	remaining: 12.7s
5:	learn: 10045.6875919	total: 76ms	remaining: 12.6s
[... per-iteration training log omitted; the training MAE decreases steadily over the remaining iterations ...]
616:	learn: 3225.8337134	total: 7.96s	remaining: 4.94s
617:	learn: 3225.4547544	total: 7.97s	remaining: 4.93s
618:	learn: 3224.2928672	total: 7.99s	remaining: 4.91s
619:	learn: 3223.5562414	total: 8s	remaining: 4.9s
620:	learn: 3223.2658948	total: 8.01s	remaining: 4.89s
621:	learn: 3222.6466043	total: 8.02s	remaining: 4.87s
622:	learn: 3222.2317374	total: 8.03s	remaining: 4.86s
623:	learn: 3222.0624489	total: 8.04s	remaining: 4.84s
624:	learn: 3221.9784831	total: 8.04s	remaining: 4.83s
625:	learn: 3221.6804202	total: 8.05s	remaining: 4.81s
626:	learn: 3220.8866383	total: 8.06s	remaining: 4.79s
627:	learn: 3220.5072922	total: 8.07s	remaining: 4.78s
628:	learn: 3220.0990113	total: 8.08s	remaining: 4.76s
629:	learn: 3219.4408846	total: 8.09s	remaining: 4.75s
630:	learn: 3218.8746125	total: 8.1s	remaining: 4.73s
631:	learn: 3218.4465829	total: 8.11s	remaining: 4.72s
632:	learn: 3217.8164571	total: 8.11s	remaining: 4.7s
633:	learn: 3217.0180848	total: 8.12s	remaining: 4.69s
634:	learn: 3216.2564761	total: 8.13s	remaining: 4.67s
635:	learn: 3215.9621193	total: 8.14s	remaining: 4.66s
636:	learn: 3215.4558891	total: 8.15s	remaining: 4.64s
637:	learn: 3215.2163709	total: 8.16s	remaining: 4.63s
638:	learn: 3214.4856155	total: 8.17s	remaining: 4.61s
639:	learn: 3214.3594315	total: 8.17s	remaining: 4.6s
640:	learn: 3213.8221115	total: 8.19s	remaining: 4.58s
641:	learn: 3213.3277610	total: 8.2s	remaining: 4.57s
642:	learn: 3212.8220520	total: 8.21s	remaining: 4.56s
643:	learn: 3212.4339643	total: 8.22s	remaining: 4.54s
644:	learn: 3212.0649770	total: 8.23s	remaining: 4.53s
645:	learn: 3211.0547303	total: 8.24s	remaining: 4.51s
646:	learn: 3210.4452447	total: 8.24s	remaining: 4.5s
647:	learn: 3209.9014024	total: 8.25s	remaining: 4.48s
648:	learn: 3209.6327659	total: 8.26s	remaining: 4.47s
649:	learn: 3209.2624688	total: 8.27s	remaining: 4.45s
650:	learn: 3208.7852612	total: 8.28s	remaining: 4.44s
651:	learn: 3208.1240922	total: 8.29s	remaining: 4.42s
652:	learn: 3207.9389043	total: 8.3s	remaining: 4.41s
653:	learn: 3207.4158278	total: 8.31s	remaining: 4.39s
654:	learn: 3207.2151605	total: 8.32s	remaining: 4.38s
655:	learn: 3206.6446641	total: 8.33s	remaining: 4.37s
656:	learn: 3206.2972409	total: 8.33s	remaining: 4.35s
657:	learn: 3205.7135020	total: 8.34s	remaining: 4.34s
658:	learn: 3205.5150001	total: 8.35s	remaining: 4.32s
659:	learn: 3204.6033780	total: 8.36s	remaining: 4.31s
660:	learn: 3204.1396119	total: 8.37s	remaining: 4.29s
661:	learn: 3203.7258389	total: 8.38s	remaining: 4.28s
662:	learn: 3203.1153030	total: 8.39s	remaining: 4.27s
663:	learn: 3202.9068817	total: 8.4s	remaining: 4.25s
664:	learn: 3202.5697809	total: 8.41s	remaining: 4.24s
665:	learn: 3201.8786419	total: 8.42s	remaining: 4.22s
666:	learn: 3200.9401626	total: 8.43s	remaining: 4.21s
667:	learn: 3200.7112144	total: 8.44s	remaining: 4.19s
668:	learn: 3200.0200790	total: 8.45s	remaining: 4.18s
669:	learn: 3199.6182311	total: 8.46s	remaining: 4.17s
670:	learn: 3199.1007567	total: 8.47s	remaining: 4.15s
671:	learn: 3198.4723061	total: 8.48s	remaining: 4.14s
672:	learn: 3197.9452650	total: 8.49s	remaining: 4.13s
673:	learn: 3197.2972868	total: 8.5s	remaining: 4.11s
674:	learn: 3196.7014289	total: 8.51s	remaining: 4.1s
675:	learn: 3196.2752263	total: 8.52s	remaining: 4.08s
676:	learn: 3195.9485707	total: 8.52s	remaining: 4.07s
677:	learn: 3195.6193849	total: 8.53s	remaining: 4.05s
678:	learn: 3195.4222933	total: 8.54s	remaining: 4.04s
679:	learn: 3194.9463414	total: 8.55s	remaining: 4.02s
680:	learn: 3194.3633132	total: 8.56s	remaining: 4.01s
681:	learn: 3193.5880075	total: 8.57s	remaining: 4s
682:	learn: 3193.1040763	total: 8.58s	remaining: 3.98s
683:	learn: 3192.5561324	total: 8.59s	remaining: 3.97s
684:	learn: 3191.8366401	total: 8.6s	remaining: 3.96s
685:	learn: 3191.5624176	total: 8.61s	remaining: 3.94s
686:	learn: 3191.2808792	total: 8.63s	remaining: 3.93s
687:	learn: 3190.8132638	total: 8.63s	remaining: 3.92s
688:	learn: 3190.0804660	total: 8.64s	remaining: 3.9s
689:	learn: 3189.7028372	total: 8.65s	remaining: 3.89s
690:	learn: 3189.2637177	total: 8.66s	remaining: 3.87s
691:	learn: 3188.8106641	total: 8.67s	remaining: 3.86s
692:	learn: 3188.4791442	total: 8.68s	remaining: 3.85s
693:	learn: 3187.6565298	total: 8.69s	remaining: 3.83s
694:	learn: 3187.2758120	total: 8.7s	remaining: 3.82s
695:	learn: 3187.0216293	total: 8.71s	remaining: 3.8s
696:	learn: 3186.3503578	total: 8.72s	remaining: 3.79s
697:	learn: 3186.1381107	total: 8.73s	remaining: 3.77s
698:	learn: 3185.5491716	total: 8.73s	remaining: 3.76s
699:	learn: 3184.9401706	total: 8.74s	remaining: 3.75s
700:	learn: 3184.4407920	total: 8.75s	remaining: 3.73s
701:	learn: 3184.0580576	total: 8.76s	remaining: 3.72s
702:	learn: 3183.6030671	total: 8.77s	remaining: 3.71s
703:	learn: 3182.7716073	total: 8.78s	remaining: 3.69s
704:	learn: 3182.3260683	total: 8.8s	remaining: 3.68s
705:	learn: 3181.9517327	total: 8.82s	remaining: 3.67s
706:	learn: 3181.7659869	total: 8.82s	remaining: 3.66s
707:	learn: 3181.3139189	total: 8.83s	remaining: 3.64s
708:	learn: 3180.6489538	total: 8.84s	remaining: 3.63s
709:	learn: 3179.9821740	total: 8.85s	remaining: 3.62s
710:	learn: 3179.4551380	total: 8.86s	remaining: 3.6s
711:	learn: 3179.1494830	total: 8.87s	remaining: 3.59s
712:	learn: 3178.4855984	total: 8.88s	remaining: 3.57s
713:	learn: 3178.0448692	total: 8.89s	remaining: 3.56s
714:	learn: 3177.6695156	total: 8.9s	remaining: 3.55s
715:	learn: 3176.9079228	total: 8.9s	remaining: 3.53s
716:	learn: 3176.4197160	total: 8.91s	remaining: 3.52s
717:	learn: 3175.8162106	total: 8.92s	remaining: 3.5s
718:	learn: 3175.2782661	total: 8.93s	remaining: 3.49s
719:	learn: 3175.0767901	total: 8.94s	remaining: 3.48s
720:	learn: 3174.6158612	total: 8.95s	remaining: 3.46s
721:	learn: 3173.8364912	total: 8.96s	remaining: 3.45s
722:	learn: 3173.6002601	total: 8.97s	remaining: 3.44s
723:	learn: 3173.1379807	total: 8.98s	remaining: 3.42s
724:	learn: 3172.6123531	total: 8.99s	remaining: 3.41s
725:	learn: 3172.0260850	total: 8.99s	remaining: 3.39s
726:	learn: 3171.5419373	total: 9.02s	remaining: 3.38s
727:	learn: 3171.3039852	total: 9.03s	remaining: 3.37s
728:	learn: 3170.5738097	total: 9.03s	remaining: 3.36s
729:	learn: 3170.0913522	total: 9.04s	remaining: 3.34s
730:	learn: 3169.2560699	total: 9.05s	remaining: 3.33s
731:	learn: 3168.6244148	total: 9.06s	remaining: 3.32s
732:	learn: 3168.2824561	total: 9.08s	remaining: 3.31s
733:	learn: 3167.8949434	total: 9.09s	remaining: 3.29s
734:	learn: 3167.3362442	total: 9.09s	remaining: 3.28s
735:	learn: 3167.0380820	total: 9.1s	remaining: 3.27s
736:	learn: 3166.6983636	total: 9.11s	remaining: 3.25s
737:	learn: 3166.1801069	total: 9.12s	remaining: 3.24s
738:	learn: 3165.0666188	total: 9.13s	remaining: 3.22s
739:	learn: 3164.6862570	total: 9.14s	remaining: 3.21s
740:	learn: 3164.3873216	total: 9.14s	remaining: 3.2s
741:	learn: 3163.3074444	total: 9.15s	remaining: 3.18s
742:	learn: 3162.5076199	total: 9.16s	remaining: 3.17s
743:	learn: 3161.6908088	total: 9.17s	remaining: 3.16s
744:	learn: 3161.1483083	total: 9.18s	remaining: 3.14s
745:	learn: 3160.8545679	total: 9.19s	remaining: 3.13s
746:	learn: 3160.3109647	total: 9.2s	remaining: 3.12s
747:	learn: 3159.9170418	total: 9.21s	remaining: 3.1s
748:	learn: 3159.5425912	total: 9.23s	remaining: 3.09s
749:	learn: 3158.9916017	total: 9.23s	remaining: 3.08s
750:	learn: 3158.4081758	total: 9.24s	remaining: 3.06s
751:	learn: 3158.0057727	total: 9.25s	remaining: 3.05s
752:	learn: 3157.5724212	total: 9.26s	remaining: 3.04s
753:	learn: 3157.1245968	total: 9.27s	remaining: 3.02s
754:	learn: 3156.7671735	total: 9.28s	remaining: 3.01s
755:	learn: 3156.5299347	total: 9.29s	remaining: 3s
756:	learn: 3156.1166126	total: 9.3s	remaining: 2.98s
757:	learn: 3155.6660857	total: 9.3s	remaining: 2.97s
758:	learn: 3154.8333938	total: 9.31s	remaining: 2.96s
759:	learn: 3154.5994405	total: 9.32s	remaining: 2.94s
760:	learn: 3154.2625891	total: 9.33s	remaining: 2.93s
761:	learn: 3153.8222747	total: 9.34s	remaining: 2.92s
762:	learn: 3153.4067396	total: 9.35s	remaining: 2.9s
763:	learn: 3153.1832008	total: 9.36s	remaining: 2.89s
764:	learn: 3152.9314022	total: 9.37s	remaining: 2.88s
765:	learn: 3152.3741161	total: 9.38s	remaining: 2.86s
766:	learn: 3152.2175950	total: 9.38s	remaining: 2.85s
767:	learn: 3151.7127431	total: 9.39s	remaining: 2.84s
768:	learn: 3151.5325115	total: 9.4s	remaining: 2.82s
769:	learn: 3150.6536618	total: 9.42s	remaining: 2.81s
770:	learn: 3150.2280473	total: 9.44s	remaining: 2.8s
771:	learn: 3149.9096228	total: 9.46s	remaining: 2.79s
772:	learn: 3149.6107470	total: 9.47s	remaining: 2.78s
773:	learn: 3149.3224651	total: 9.48s	remaining: 2.77s
774:	learn: 3149.0811043	total: 9.49s	remaining: 2.75s
775:	learn: 3148.7889230	total: 9.49s	remaining: 2.74s
776:	learn: 3148.6346089	total: 9.5s	remaining: 2.73s
777:	learn: 3148.2920039	total: 9.51s	remaining: 2.71s
778:	learn: 3147.9004006	total: 9.52s	remaining: 2.7s
779:	learn: 3147.5503241	total: 9.53s	remaining: 2.69s
780:	learn: 3147.2491441	total: 9.54s	remaining: 2.67s
781:	learn: 3146.6797161	total: 9.55s	remaining: 2.66s
782:	learn: 3146.3929230	total: 9.55s	remaining: 2.65s
783:	learn: 3146.2923313	total: 9.56s	remaining: 2.63s
784:	learn: 3146.1229582	total: 9.57s	remaining: 2.62s
785:	learn: 3145.8381998	total: 9.58s	remaining: 2.61s
786:	learn: 3145.4585413	total: 9.59s	remaining: 2.6s
787:	learn: 3145.2098256	total: 9.6s	remaining: 2.58s
788:	learn: 3145.0387820	total: 9.61s	remaining: 2.57s
789:	learn: 3144.4924751	total: 9.63s	remaining: 2.56s
790:	learn: 3144.0286801	total: 9.65s	remaining: 2.55s
791:	learn: 3143.8526551	total: 9.66s	remaining: 2.54s
792:	learn: 3143.7850465	total: 9.66s	remaining: 2.52s
793:	learn: 3143.2176742	total: 9.67s	remaining: 2.51s
794:	learn: 3142.2459273	total: 9.68s	remaining: 2.5s
795:	learn: 3141.6670701	total: 9.69s	remaining: 2.48s
796:	learn: 3141.2315833	total: 9.7s	remaining: 2.47s
797:	learn: 3140.8931159	total: 9.71s	remaining: 2.46s
798:	learn: 3140.5185445	total: 9.72s	remaining: 2.44s
799:	learn: 3140.3684480	total: 9.73s	remaining: 2.43s
800:	learn: 3139.6967448	total: 9.74s	remaining: 2.42s
801:	learn: 3139.1235391	total: 9.75s	remaining: 2.41s
802:	learn: 3138.7634438	total: 9.76s	remaining: 2.39s
803:	learn: 3138.4016613	total: 9.77s	remaining: 2.38s
804:	learn: 3138.2865811	total: 9.78s	remaining: 2.37s
805:	learn: 3137.9843152	total: 9.79s	remaining: 2.35s
806:	learn: 3137.5857385	total: 9.79s	remaining: 2.34s
807:	learn: 3137.2595864	total: 9.8s	remaining: 2.33s
808:	learn: 3137.0500312	total: 9.81s	remaining: 2.32s
809:	learn: 3136.9883022	total: 9.82s	remaining: 2.3s
810:	learn: 3136.6610949	total: 9.84s	remaining: 2.29s
811:	learn: 3136.4156992	total: 9.85s	remaining: 2.28s
812:	learn: 3136.2255811	total: 9.86s	remaining: 2.27s
813:	learn: 3136.0829225	total: 9.86s	remaining: 2.25s
814:	learn: 3135.7179628	total: 9.87s	remaining: 2.24s
815:	learn: 3135.4810972	total: 9.88s	remaining: 2.23s
816:	learn: 3134.9914649	total: 9.89s	remaining: 2.21s
817:	learn: 3134.7430168	total: 9.9s	remaining: 2.2s
818:	learn: 3134.4540114	total: 9.91s	remaining: 2.19s
819:	learn: 3134.2465803	total: 9.92s	remaining: 2.18s
820:	learn: 3133.8572518	total: 9.93s	remaining: 2.16s
821:	learn: 3133.5961856	total: 9.94s	remaining: 2.15s
822:	learn: 3133.3646186	total: 9.94s	remaining: 2.14s
823:	learn: 3133.0375013	total: 9.95s	remaining: 2.13s
824:	learn: 3132.8075205	total: 9.96s	remaining: 2.11s
825:	learn: 3132.5902213	total: 9.97s	remaining: 2.1s
826:	learn: 3132.4427408	total: 9.98s	remaining: 2.09s
827:	learn: 3132.2387205	total: 9.99s	remaining: 2.07s
828:	learn: 3131.9035091	total: 9.99s	remaining: 2.06s
829:	learn: 3131.7516651	total: 10s	remaining: 2.05s
830:	learn: 3131.6159398	total: 10s	remaining: 2.04s
831:	learn: 3131.3926941	total: 10s	remaining: 2.02s
832:	learn: 3131.2458750	total: 10s	remaining: 2.01s
833:	learn: 3130.8818550	total: 10s	remaining: 2s
834:	learn: 3130.2964973	total: 10.1s	remaining: 1.99s
835:	learn: 3129.9563310	total: 10.1s	remaining: 1.98s
836:	learn: 3129.2967888	total: 10.1s	remaining: 1.96s
837:	learn: 3129.0666760	total: 10.1s	remaining: 1.95s
838:	learn: 3128.6074943	total: 10.1s	remaining: 1.94s
839:	learn: 3128.1805529	total: 10.1s	remaining: 1.92s
840:	learn: 3128.0042640	total: 10.1s	remaining: 1.91s
841:	learn: 3127.5817678	total: 10.1s	remaining: 1.9s
842:	learn: 3127.2865620	total: 10.1s	remaining: 1.89s
843:	learn: 3126.8920106	total: 10.1s	remaining: 1.87s
844:	learn: 3126.4813911	total: 10.1s	remaining: 1.86s
845:	learn: 3126.1005706	total: 10.2s	remaining: 1.85s
846:	learn: 3125.8262218	total: 10.2s	remaining: 1.84s
847:	learn: 3125.4267885	total: 10.2s	remaining: 1.82s
848:	learn: 3124.6278405	total: 10.2s	remaining: 1.81s
849:	learn: 3124.3284536	total: 10.2s	remaining: 1.8s
850:	learn: 3123.9713552	total: 10.2s	remaining: 1.79s
851:	learn: 3123.7716741	total: 10.2s	remaining: 1.77s
852:	learn: 3123.0861414	total: 10.2s	remaining: 1.76s
853:	learn: 3122.4481236	total: 10.2s	remaining: 1.75s
854:	learn: 3121.6340941	total: 10.2s	remaining: 1.74s
855:	learn: 3121.4070274	total: 10.3s	remaining: 1.73s
856:	learn: 3120.8337780	total: 10.3s	remaining: 1.71s
857:	learn: 3119.9377185	total: 10.3s	remaining: 1.7s
858:	learn: 3119.5794017	total: 10.3s	remaining: 1.69s
859:	learn: 3118.8394972	total: 10.3s	remaining: 1.68s
860:	learn: 3118.0811506	total: 10.3s	remaining: 1.66s
861:	learn: 3117.5553213	total: 10.3s	remaining: 1.65s
862:	learn: 3117.1657699	total: 10.3s	remaining: 1.64s
863:	learn: 3116.6904487	total: 10.3s	remaining: 1.63s
864:	learn: 3115.8987897	total: 10.3s	remaining: 1.61s
865:	learn: 3115.7433464	total: 10.3s	remaining: 1.6s
866:	learn: 3115.3460392	total: 10.4s	remaining: 1.59s
867:	learn: 3114.9803165	total: 10.4s	remaining: 1.58s
868:	learn: 3114.7439911	total: 10.4s	remaining: 1.56s
869:	learn: 3114.5486031	total: 10.4s	remaining: 1.55s
870:	learn: 3114.2200520	total: 10.4s	remaining: 1.54s
871:	learn: 3113.1833675	total: 10.4s	remaining: 1.53s
872:	learn: 3112.6488968	total: 10.4s	remaining: 1.52s
873:	learn: 3112.5570422	total: 10.4s	remaining: 1.5s
874:	learn: 3112.2548825	total: 10.5s	remaining: 1.49s
875:	learn: 3112.0598756	total: 10.5s	remaining: 1.48s
876:	learn: 3111.8507163	total: 10.5s	remaining: 1.47s
877:	learn: 3111.5423180	total: 10.5s	remaining: 1.46s
878:	learn: 3111.0845062	total: 10.5s	remaining: 1.44s
879:	learn: 3110.5549902	total: 10.5s	remaining: 1.43s
880:	learn: 3110.1388895	total: 10.5s	remaining: 1.42s
881:	learn: 3109.6359406	total: 10.5s	remaining: 1.41s
882:	learn: 3109.4636763	total: 10.5s	remaining: 1.4s
883:	learn: 3109.2452448	total: 10.5s	remaining: 1.38s
884:	learn: 3108.9223291	total: 10.6s	remaining: 1.37s
885:	learn: 3108.5862558	total: 10.6s	remaining: 1.36s
886:	learn: 3108.2434344	total: 10.6s	remaining: 1.35s
887:	learn: 3108.0335442	total: 10.6s	remaining: 1.33s
888:	learn: 3107.9503960	total: 10.6s	remaining: 1.32s
889:	learn: 3107.6902336	total: 10.6s	remaining: 1.31s
890:	learn: 3107.3839097	total: 10.6s	remaining: 1.3s
891:	learn: 3107.0922936	total: 10.6s	remaining: 1.28s
892:	learn: 3106.6249360	total: 10.6s	remaining: 1.27s
893:	learn: 3106.1663964	total: 10.6s	remaining: 1.26s
894:	learn: 3105.4465207	total: 10.6s	remaining: 1.25s
895:	learn: 3105.0850818	total: 10.6s	remaining: 1.24s
896:	learn: 3104.7922860	total: 10.7s	remaining: 1.22s
897:	learn: 3104.0242524	total: 10.7s	remaining: 1.21s
898:	learn: 3103.3700903	total: 10.7s	remaining: 1.2s
899:	learn: 3102.9064080	total: 10.7s	remaining: 1.19s
900:	learn: 3102.6515331	total: 10.7s	remaining: 1.18s
901:	learn: 3102.3724820	total: 10.7s	remaining: 1.16s
902:	learn: 3101.7665074	total: 10.7s	remaining: 1.15s
903:	learn: 3101.4948173	total: 10.7s	remaining: 1.14s
904:	learn: 3101.0885825	total: 10.7s	remaining: 1.13s
905:	learn: 3100.8491265	total: 10.7s	remaining: 1.11s
906:	learn: 3100.4516314	total: 10.8s	remaining: 1.1s
907:	learn: 3100.0195737	total: 10.8s	remaining: 1.09s
908:	learn: 3099.7421982	total: 10.8s	remaining: 1.08s
909:	learn: 3099.5412553	total: 10.8s	remaining: 1.07s
910:	learn: 3099.1806710	total: 10.8s	remaining: 1.05s
911:	learn: 3098.8022629	total: 10.8s	remaining: 1.04s
912:	learn: 3098.3163690	total: 10.8s	remaining: 1.03s
913:	learn: 3097.9147788	total: 10.8s	remaining: 1.02s
914:	learn: 3097.4292791	total: 10.8s	remaining: 1.01s
915:	learn: 3097.0574787	total: 10.8s	remaining: 994ms
916:	learn: 3096.8737304	total: 10.8s	remaining: 982ms
917:	learn: 3096.5909920	total: 10.9s	remaining: 970ms
918:	learn: 3096.3506254	total: 10.9s	remaining: 958ms
919:	learn: 3096.0777382	total: 10.9s	remaining: 946ms
920:	learn: 3095.8145540	total: 10.9s	remaining: 934ms
921:	learn: 3095.5910340	total: 10.9s	remaining: 922ms
922:	learn: 3094.8900835	total: 10.9s	remaining: 910ms
923:	learn: 3094.5629666	total: 10.9s	remaining: 898ms
924:	learn: 3094.0198709	total: 10.9s	remaining: 886ms
925:	learn: 3093.7135768	total: 10.9s	remaining: 874ms
926:	learn: 3093.3159003	total: 10.9s	remaining: 862ms
927:	learn: 3092.7925066	total: 11s	remaining: 850ms
928:	learn: 3092.3556849	total: 11s	remaining: 839ms
929:	learn: 3091.7448732	total: 11s	remaining: 827ms
930:	learn: 3091.4831365	total: 11s	remaining: 814ms
931:	learn: 3091.0568065	total: 11s	remaining: 802ms
932:	learn: 3090.7501115	total: 11s	remaining: 790ms
933:	learn: 3090.3385167	total: 11s	remaining: 778ms
934:	learn: 3089.8605002	total: 11s	remaining: 767ms
935:	learn: 3089.4858204	total: 11s	remaining: 755ms
936:	learn: 3089.2436354	total: 11s	remaining: 743ms
937:	learn: 3088.9054352	total: 11.1s	remaining: 731ms
938:	learn: 3088.6836851	total: 11.1s	remaining: 719ms
939:	learn: 3087.7865442	total: 11.1s	remaining: 707ms
940:	learn: 3087.5096523	total: 11.1s	remaining: 695ms
941:	learn: 3087.0457869	total: 11.1s	remaining: 683ms
942:	learn: 3086.4998630	total: 11.1s	remaining: 671ms
943:	learn: 3086.1772889	total: 11.1s	remaining: 659ms
944:	learn: 3085.7597372	total: 11.1s	remaining: 648ms
945:	learn: 3085.5080044	total: 11.1s	remaining: 636ms
946:	learn: 3085.0655084	total: 11.1s	remaining: 624ms
947:	learn: 3084.6523372	total: 11.2s	remaining: 612ms
948:	learn: 3084.2753567	total: 11.2s	remaining: 600ms
949:	learn: 3083.9741658	total: 11.2s	remaining: 588ms
950:	learn: 3083.3753175	total: 11.2s	remaining: 576ms
951:	learn: 3082.8367316	total: 11.2s	remaining: 564ms
952:	learn: 3082.4948468	total: 11.2s	remaining: 552ms
953:	learn: 3082.2646654	total: 11.2s	remaining: 541ms
954:	learn: 3081.2392452	total: 11.2s	remaining: 529ms
955:	learn: 3080.6875585	total: 11.2s	remaining: 517ms
956:	learn: 3080.1313986	total: 11.2s	remaining: 505ms
957:	learn: 3079.8145977	total: 11.2s	remaining: 493ms
958:	learn: 3079.5372392	total: 11.3s	remaining: 481ms
959:	learn: 3079.0940581	total: 11.3s	remaining: 470ms
960:	learn: 3078.6096739	total: 11.3s	remaining: 458ms
961:	learn: 3078.2116834	total: 11.3s	remaining: 447ms
962:	learn: 3077.8143591	total: 11.3s	remaining: 435ms
963:	learn: 3077.4789573	total: 11.3s	remaining: 423ms
964:	learn: 3076.8569762	total: 11.3s	remaining: 411ms
965:	learn: 3076.3258154	total: 11.3s	remaining: 399ms
966:	learn: 3075.8224949	total: 11.4s	remaining: 388ms
967:	learn: 3075.5821261	total: 11.4s	remaining: 376ms
968:	learn: 3075.1207762	total: 11.4s	remaining: 364ms
969:	learn: 3074.5904223	total: 11.4s	remaining: 352ms
970:	learn: 3074.2375416	total: 11.4s	remaining: 340ms
971:	learn: 3073.9042769	total: 11.4s	remaining: 329ms
972:	learn: 3073.3650926	total: 11.4s	remaining: 317ms
973:	learn: 3072.8834888	total: 11.4s	remaining: 305ms
974:	learn: 3072.7135775	total: 11.4s	remaining: 294ms
975:	learn: 3072.2250441	total: 11.5s	remaining: 282ms
976:	learn: 3071.8049769	total: 11.5s	remaining: 270ms
977:	learn: 3071.4461943	total: 11.5s	remaining: 258ms
978:	learn: 3071.2487587	total: 11.5s	remaining: 246ms
979:	learn: 3070.8031997	total: 11.5s	remaining: 235ms
980:	learn: 3070.2129085	total: 11.5s	remaining: 223ms
981:	learn: 3069.8821515	total: 11.5s	remaining: 211ms
982:	learn: 3069.4042943	total: 11.5s	remaining: 199ms
983:	learn: 3069.1140203	total: 11.5s	remaining: 188ms
984:	learn: 3068.4594726	total: 11.6s	remaining: 176ms
985:	learn: 3067.9723238	total: 11.6s	remaining: 164ms
986:	learn: 3067.6802993	total: 11.6s	remaining: 152ms
987:	learn: 3067.3729318	total: 11.6s	remaining: 141ms
988:	learn: 3067.0558502	total: 11.6s	remaining: 129ms
989:	learn: 3066.5989151	total: 11.6s	remaining: 117ms
990:	learn: 3066.3448835	total: 11.6s	remaining: 105ms
991:	learn: 3066.0682778	total: 11.6s	remaining: 93.7ms
992:	learn: 3065.6099701	total: 11.6s	remaining: 82ms
993:	learn: 3065.3057616	total: 11.6s	remaining: 70.3ms
994:	learn: 3064.8084972	total: 11.7s	remaining: 58.6ms
995:	learn: 3064.4149929	total: 11.7s	remaining: 46.8ms
996:	learn: 3064.0328123	total: 11.7s	remaining: 35.1ms
997:	learn: 3063.6026329	total: 11.7s	remaining: 23.4ms
998:	learn: 3062.8170419	total: 11.7s	remaining: 11.7ms
999:	learn: 3062.3710004	total: 11.7s	remaining: 0us
[... CatBoost training log truncated: a second run starts at iteration 0 with training RMSE 11595.10 and reaches 3567.93 by iteration 201 ...]
202:	learn: 3566.9679524	total: 4.11s	remaining: 16.1s
203:	learn: 3565.4029151	total: 4.12s	remaining: 16.1s
204:	learn: 3564.3759420	total: 4.13s	remaining: 16s
205:	learn: 3563.2220712	total: 4.14s	remaining: 15.9s
206:	learn: 3561.7210501	total: 4.14s	remaining: 15.9s
207:	learn: 3560.0854644	total: 4.16s	remaining: 15.8s
208:	learn: 3558.5763164	total: 4.16s	remaining: 15.8s
209:	learn: 3557.2832530	total: 4.17s	remaining: 15.7s
210:	learn: 3556.3428135	total: 4.18s	remaining: 15.6s
211:	learn: 3555.1477230	total: 4.19s	remaining: 15.6s
212:	learn: 3554.0170799	total: 4.2s	remaining: 15.5s
213:	learn: 3552.6388722	total: 4.21s	remaining: 15.5s
214:	learn: 3551.0884339	total: 4.22s	remaining: 15.4s
215:	learn: 3549.8761931	total: 4.23s	remaining: 15.3s
216:	learn: 3548.6921995	total: 4.24s	remaining: 15.3s
217:	learn: 3546.7924364	total: 4.25s	remaining: 15.2s
218:	learn: 3545.6053280	total: 4.25s	remaining: 15.2s
219:	learn: 3544.7976471	total: 4.27s	remaining: 15.1s
220:	learn: 3543.4239221	total: 4.28s	remaining: 15.1s
221:	learn: 3541.2898896	total: 4.29s	remaining: 15s
222:	learn: 3539.7781542	total: 4.3s	remaining: 15s
223:	learn: 3538.8200956	total: 4.31s	remaining: 14.9s
224:	learn: 3537.0161341	total: 4.32s	remaining: 14.9s
225:	learn: 3535.7893708	total: 4.32s	remaining: 14.8s
226:	learn: 3534.6177666	total: 4.33s	remaining: 14.8s
227:	learn: 3533.5534147	total: 4.34s	remaining: 14.7s
228:	learn: 3531.2213574	total: 4.36s	remaining: 14.7s
229:	learn: 3529.9284778	total: 4.36s	remaining: 14.6s
230:	learn: 3529.2338597	total: 4.37s	remaining: 14.6s
231:	learn: 3528.2527235	total: 4.38s	remaining: 14.5s
232:	learn: 3527.1218113	total: 4.39s	remaining: 14.5s
233:	learn: 3526.1160368	total: 4.4s	remaining: 14.4s
234:	learn: 3525.0594440	total: 4.41s	remaining: 14.4s
235:	learn: 3523.3062230	total: 4.42s	remaining: 14.3s
236:	learn: 3522.3292907	total: 4.43s	remaining: 14.3s
237:	learn: 3521.4547930	total: 4.44s	remaining: 14.2s
238:	learn: 3520.7230370	total: 4.45s	remaining: 14.2s
239:	learn: 3519.3544409	total: 4.46s	remaining: 14.1s
240:	learn: 3518.4883139	total: 4.47s	remaining: 14.1s
241:	learn: 3517.1267134	total: 4.48s	remaining: 14s
242:	learn: 3516.3364572	total: 4.49s	remaining: 14s
243:	learn: 3515.1104513	total: 4.5s	remaining: 13.9s
244:	learn: 3513.4742100	total: 4.51s	remaining: 13.9s
245:	learn: 3512.4324392	total: 4.52s	remaining: 13.8s
246:	learn: 3511.0069134	total: 4.53s	remaining: 13.8s
247:	learn: 3509.4835930	total: 4.54s	remaining: 13.8s
248:	learn: 3508.8899978	total: 4.55s	remaining: 13.7s
249:	learn: 3508.1806627	total: 4.57s	remaining: 13.7s
250:	learn: 3506.6244037	total: 4.57s	remaining: 13.7s
251:	learn: 3505.2430543	total: 4.58s	remaining: 13.6s
252:	learn: 3504.4497731	total: 4.59s	remaining: 13.6s
253:	learn: 3503.4469141	total: 4.6s	remaining: 13.5s
254:	learn: 3503.0257673	total: 4.61s	remaining: 13.5s
255:	learn: 3501.8871706	total: 4.62s	remaining: 13.4s
256:	learn: 3501.0959483	total: 4.63s	remaining: 13.4s
257:	learn: 3500.2085126	total: 4.64s	remaining: 13.3s
258:	learn: 3498.4088626	total: 4.65s	remaining: 13.3s
259:	learn: 3497.8599439	total: 4.65s	remaining: 13.2s
260:	learn: 3497.1621282	total: 4.66s	remaining: 13.2s
261:	learn: 3496.4053176	total: 4.69s	remaining: 13.2s
262:	learn: 3495.0980035	total: 4.7s	remaining: 13.2s
263:	learn: 3494.1729622	total: 4.71s	remaining: 13.1s
264:	learn: 3493.3733546	total: 4.72s	remaining: 13.1s
265:	learn: 3492.7397499	total: 4.72s	remaining: 13s
266:	learn: 3491.9798197	total: 4.73s	remaining: 13s
267:	learn: 3491.4206159	total: 4.74s	remaining: 13s
268:	learn: 3490.8487174	total: 4.75s	remaining: 12.9s
269:	learn: 3490.4425719	total: 4.76s	remaining: 12.9s
270:	learn: 3489.4488187	total: 4.77s	remaining: 12.8s
271:	learn: 3489.0286174	total: 4.77s	remaining: 12.8s
272:	learn: 3488.2720245	total: 4.78s	remaining: 12.7s
273:	learn: 3487.5480346	total: 4.79s	remaining: 12.7s
274:	learn: 3486.7218397	total: 4.8s	remaining: 12.7s
275:	learn: 3485.5135663	total: 4.81s	remaining: 12.6s
276:	learn: 3485.0219613	total: 4.82s	remaining: 12.6s
277:	learn: 3484.2137038	total: 4.83s	remaining: 12.5s
278:	learn: 3482.9868190	total: 4.84s	remaining: 12.5s
279:	learn: 3482.0452947	total: 4.85s	remaining: 12.5s
280:	learn: 3480.8154493	total: 4.86s	remaining: 12.4s
281:	learn: 3480.1567367	total: 4.87s	remaining: 12.4s
282:	learn: 3479.1600374	total: 4.88s	remaining: 12.4s
283:	learn: 3478.2725468	total: 4.89s	remaining: 12.3s
284:	learn: 3477.3773958	total: 4.9s	remaining: 12.3s
285:	learn: 3476.3353277	total: 4.91s	remaining: 12.3s
286:	learn: 3475.8442896	total: 4.92s	remaining: 12.2s
287:	learn: 3475.2481561	total: 4.93s	remaining: 12.2s
288:	learn: 3474.4129221	total: 4.93s	remaining: 12.1s
289:	learn: 3473.6496628	total: 4.94s	remaining: 12.1s
290:	learn: 3472.3730562	total: 4.95s	remaining: 12.1s
291:	learn: 3471.3530914	total: 4.96s	remaining: 12s
292:	learn: 3470.7342228	total: 4.98s	remaining: 12s
293:	learn: 3469.9961785	total: 4.99s	remaining: 12s
294:	learn: 3469.5148009	total: 5s	remaining: 11.9s
295:	learn: 3468.7102064	total: 5s	remaining: 11.9s
296:	learn: 3468.1653668	total: 5.01s	remaining: 11.9s
297:	learn: 3467.5788435	total: 5.02s	remaining: 11.8s
298:	learn: 3466.6297185	total: 5.03s	remaining: 11.8s
299:	learn: 3465.9470314	total: 5.04s	remaining: 11.8s
300:	learn: 3464.8131759	total: 5.05s	remaining: 11.7s
301:	learn: 3463.7423191	total: 5.06s	remaining: 11.7s
302:	learn: 3462.6711649	total: 5.08s	remaining: 11.7s
303:	learn: 3461.7972960	total: 5.09s	remaining: 11.6s
304:	learn: 3460.9888895	total: 5.1s	remaining: 11.6s
305:	learn: 3460.1838614	total: 5.11s	remaining: 11.6s
306:	learn: 3459.5671611	total: 5.11s	remaining: 11.5s
307:	learn: 3458.8160594	total: 5.12s	remaining: 11.5s
308:	learn: 3457.9429248	total: 5.13s	remaining: 11.5s
309:	learn: 3456.6205083	total: 5.14s	remaining: 11.4s
310:	learn: 3456.3022446	total: 5.15s	remaining: 11.4s
311:	learn: 3455.7405432	total: 5.16s	remaining: 11.4s
312:	learn: 3454.7340739	total: 5.17s	remaining: 11.3s
313:	learn: 3453.9521941	total: 5.18s	remaining: 11.3s
314:	learn: 3453.5221106	total: 5.18s	remaining: 11.3s
315:	learn: 3452.2502763	total: 5.19s	remaining: 11.2s
316:	learn: 3451.5797316	total: 5.2s	remaining: 11.2s
317:	learn: 3450.7055441	total: 5.21s	remaining: 11.2s
318:	learn: 3449.7194496	total: 5.22s	remaining: 11.1s
319:	learn: 3448.7897699	total: 5.23s	remaining: 11.1s
320:	learn: 3448.1905003	total: 5.24s	remaining: 11.1s
321:	learn: 3447.2282617	total: 5.25s	remaining: 11s
322:	learn: 3446.3904787	total: 5.26s	remaining: 11s
323:	learn: 3446.1291125	total: 5.26s	remaining: 11s
324:	learn: 3444.9958514	total: 5.28s	remaining: 11s
325:	learn: 3444.4764609	total: 5.3s	remaining: 11s
326:	learn: 3444.0971394	total: 5.31s	remaining: 10.9s
327:	learn: 3442.9862992	total: 5.32s	remaining: 10.9s
328:	learn: 3442.4569229	total: 5.33s	remaining: 10.9s
329:	learn: 3441.2417091	total: 5.34s	remaining: 10.8s
330:	learn: 3440.4467856	total: 5.34s	remaining: 10.8s
331:	learn: 3439.7734711	total: 5.35s	remaining: 10.8s
332:	learn: 3439.3109582	total: 5.36s	remaining: 10.7s
333:	learn: 3438.6658632	total: 5.37s	remaining: 10.7s
334:	learn: 3437.8247875	total: 5.38s	remaining: 10.7s
335:	learn: 3437.1769390	total: 5.39s	remaining: 10.6s
336:	learn: 3436.3832254	total: 5.4s	remaining: 10.6s
337:	learn: 3435.5219546	total: 5.41s	remaining: 10.6s
338:	learn: 3434.4462333	total: 5.42s	remaining: 10.6s
339:	learn: 3433.3878146	total: 5.42s	remaining: 10.5s
340:	learn: 3432.2932633	total: 5.43s	remaining: 10.5s
341:	learn: 3431.0812069	total: 5.44s	remaining: 10.5s
342:	learn: 3430.2206545	total: 5.45s	remaining: 10.4s
343:	learn: 3429.5112909	total: 5.46s	remaining: 10.4s
344:	learn: 3429.1053471	total: 5.47s	remaining: 10.4s
345:	learn: 3428.2314415	total: 5.48s	remaining: 10.4s
346:	learn: 3427.5926731	total: 5.49s	remaining: 10.3s
347:	learn: 3427.1390042	total: 5.5s	remaining: 10.3s
348:	learn: 3426.2210567	total: 5.51s	remaining: 10.3s
349:	learn: 3425.6164317	total: 5.52s	remaining: 10.2s
350:	learn: 3423.7548960	total: 5.53s	remaining: 10.2s
351:	learn: 3423.0090933	total: 5.55s	remaining: 10.2s
352:	learn: 3421.8643733	total: 5.56s	remaining: 10.2s
353:	learn: 3421.0415543	total: 5.57s	remaining: 10.2s
354:	learn: 3420.2277093	total: 5.58s	remaining: 10.1s
355:	learn: 3419.6062133	total: 5.58s	remaining: 10.1s
356:	learn: 3418.8278732	total: 5.59s	remaining: 10.1s
357:	learn: 3417.8612158	total: 5.6s	remaining: 10s
358:	learn: 3416.8820960	total: 5.61s	remaining: 10s
359:	learn: 3416.1262173	total: 5.62s	remaining: 9.99s
360:	learn: 3415.7175288	total: 5.63s	remaining: 9.97s
361:	learn: 3414.7423738	total: 5.64s	remaining: 9.95s
362:	learn: 3413.6198607	total: 5.65s	remaining: 9.92s
363:	learn: 3413.0571249	total: 5.66s	remaining: 9.9s
364:	learn: 3412.6630748	total: 5.67s	remaining: 9.87s
365:	learn: 3412.1963596	total: 5.68s	remaining: 9.84s
366:	learn: 3411.8064152	total: 5.7s	remaining: 9.82s
367:	learn: 3411.0948950	total: 5.7s	remaining: 9.8s
368:	learn: 3410.1574465	total: 5.71s	remaining: 9.77s
369:	learn: 3409.1653869	total: 5.72s	remaining: 9.74s
370:	learn: 3408.5275824	total: 5.73s	remaining: 9.72s
371:	learn: 3408.2281272	total: 5.74s	remaining: 9.69s
372:	learn: 3407.2116903	total: 5.75s	remaining: 9.66s
373:	learn: 3406.4251423	total: 5.76s	remaining: 9.64s
374:	learn: 3405.2988334	total: 5.77s	remaining: 9.62s
375:	learn: 3404.3918491	total: 5.78s	remaining: 9.59s
376:	learn: 3403.7672322	total: 5.79s	remaining: 9.56s
377:	learn: 3403.2044418	total: 5.8s	remaining: 9.54s
378:	learn: 3402.5483952	total: 5.8s	remaining: 9.51s
379:	learn: 3401.5692261	total: 5.81s	remaining: 9.48s
380:	learn: 3400.6787394	total: 5.82s	remaining: 9.46s
381:	learn: 3399.8537017	total: 5.83s	remaining: 9.43s
382:	learn: 3399.2713258	total: 5.84s	remaining: 9.41s
383:	learn: 3398.6051182	total: 5.85s	remaining: 9.38s
384:	learn: 3397.6933940	total: 5.86s	remaining: 9.36s
385:	learn: 3396.7262684	total: 5.87s	remaining: 9.33s
386:	learn: 3395.6992207	total: 5.87s	remaining: 9.3s
387:	learn: 3394.9577266	total: 5.89s	remaining: 9.29s
388:	learn: 3394.0107622	total: 5.91s	remaining: 9.29s
389:	learn: 3393.3587414	total: 5.92s	remaining: 9.26s
390:	learn: 3392.6978584	total: 5.93s	remaining: 9.24s
391:	learn: 3392.0736702	total: 5.94s	remaining: 9.21s
392:	learn: 3390.9061810	total: 5.95s	remaining: 9.19s
393:	learn: 3390.0993334	total: 5.96s	remaining: 9.16s
394:	learn: 3389.4291193	total: 5.96s	remaining: 9.13s
395:	learn: 3388.8095467	total: 5.97s	remaining: 9.11s
396:	learn: 3388.5260096	total: 5.98s	remaining: 9.09s
397:	learn: 3387.7822356	total: 5.99s	remaining: 9.06s
398:	learn: 3386.6966408	total: 6s	remaining: 9.04s
399:	learn: 3385.8230466	total: 6.01s	remaining: 9.01s
400:	learn: 3384.9765056	total: 6.02s	remaining: 8.99s
401:	learn: 3384.4905911	total: 6.03s	remaining: 8.96s
402:	learn: 3383.7824174	total: 6.04s	remaining: 8.94s
403:	learn: 3382.8299454	total: 6.04s	remaining: 8.92s
404:	learn: 3382.2286171	total: 6.05s	remaining: 8.89s
405:	learn: 3381.2799846	total: 6.06s	remaining: 8.87s
406:	learn: 3380.6609285	total: 6.07s	remaining: 8.85s
407:	learn: 3379.8941841	total: 6.08s	remaining: 8.83s
408:	learn: 3379.4919279	total: 6.1s	remaining: 8.82s
409:	learn: 3378.7534460	total: 6.11s	remaining: 8.8s
410:	learn: 3378.1224554	total: 6.12s	remaining: 8.77s
411:	learn: 3377.2863302	total: 6.13s	remaining: 8.75s
412:	learn: 3376.8151424	total: 6.14s	remaining: 8.73s
413:	learn: 3375.7088259	total: 6.15s	remaining: 8.71s
414:	learn: 3375.1109763	total: 6.16s	remaining: 8.68s
415:	learn: 3374.2542628	total: 6.17s	remaining: 8.66s
416:	learn: 3373.8453973	total: 6.18s	remaining: 8.63s
417:	learn: 3372.7800102	total: 6.18s	remaining: 8.61s
418:	learn: 3372.3110093	total: 6.19s	remaining: 8.59s
419:	learn: 3371.1310954	total: 6.2s	remaining: 8.57s
420:	learn: 3370.7908617	total: 6.21s	remaining: 8.54s
421:	learn: 3369.8206660	total: 6.22s	remaining: 8.52s
422:	learn: 3369.3341042	total: 6.23s	remaining: 8.5s
423:	learn: 3368.8547670	total: 6.24s	remaining: 8.47s
424:	learn: 3368.5581281	total: 6.24s	remaining: 8.45s
425:	learn: 3368.0535207	total: 6.25s	remaining: 8.43s
426:	learn: 3367.4862304	total: 6.26s	remaining: 8.4s
427:	learn: 3367.1684210	total: 6.27s	remaining: 8.38s
428:	learn: 3366.5577981	total: 6.28s	remaining: 8.36s
429:	learn: 3365.2951895	total: 6.29s	remaining: 8.34s
430:	learn: 3364.8734152	total: 6.3s	remaining: 8.32s
431:	learn: 3364.3478475	total: 6.32s	remaining: 8.31s
432:	learn: 3364.1107751	total: 6.33s	remaining: 8.29s
433:	learn: 3363.5738974	total: 6.34s	remaining: 8.27s
434:	learn: 3363.0437343	total: 6.35s	remaining: 8.24s
435:	learn: 3362.4529362	total: 6.36s	remaining: 8.22s
436:	learn: 3361.7879468	total: 6.36s	remaining: 8.2s
437:	learn: 3361.3560351	total: 6.37s	remaining: 8.18s
438:	learn: 3360.5900418	total: 6.38s	remaining: 8.15s
439:	learn: 3359.9072200	total: 6.39s	remaining: 8.13s
440:	learn: 3359.2313691	total: 6.4s	remaining: 8.11s
441:	learn: 3358.7659459	total: 6.41s	remaining: 8.09s
442:	learn: 3358.1365211	total: 6.42s	remaining: 8.07s
443:	learn: 3357.6213824	total: 6.43s	remaining: 8.05s
444:	learn: 3357.1712462	total: 6.44s	remaining: 8.03s
445:	learn: 3356.6434789	total: 6.45s	remaining: 8.01s
446:	learn: 3355.9809739	total: 6.46s	remaining: 7.99s
447:	learn: 3355.3797167	total: 6.47s	remaining: 7.97s
448:	learn: 3355.0611969	total: 6.48s	remaining: 7.95s
449:	learn: 3354.2912214	total: 6.49s	remaining: 7.93s
450:	learn: 3353.6893255	total: 6.51s	remaining: 7.92s
451:	learn: 3352.8130732	total: 6.53s	remaining: 7.91s
452:	learn: 3352.2295752	total: 6.54s	remaining: 7.89s
453:	learn: 3351.6296872	total: 6.55s	remaining: 7.87s
454:	learn: 3351.2384132	total: 6.55s	remaining: 7.85s
455:	learn: 3350.4663364	total: 6.56s	remaining: 7.83s
456:	learn: 3350.1390601	total: 6.57s	remaining: 7.81s
457:	learn: 3349.6344768	total: 6.58s	remaining: 7.79s
458:	learn: 3348.8319176	total: 6.59s	remaining: 7.77s
459:	learn: 3348.3610477	total: 6.6s	remaining: 7.75s
460:	learn: 3347.0284315	total: 6.61s	remaining: 7.73s
461:	learn: 3346.5505071	total: 6.62s	remaining: 7.71s
462:	learn: 3346.1127756	total: 6.63s	remaining: 7.68s
463:	learn: 3345.2862509	total: 6.63s	remaining: 7.66s
464:	learn: 3344.5023153	total: 6.64s	remaining: 7.64s
465:	learn: 3343.8755840	total: 6.65s	remaining: 7.62s
466:	learn: 3343.4676155	total: 6.66s	remaining: 7.6s
467:	learn: 3343.1170531	total: 6.67s	remaining: 7.58s
468:	learn: 3342.0993551	total: 6.68s	remaining: 7.56s
469:	learn: 3341.3563209	total: 6.69s	remaining: 7.54s
470:	learn: 3340.9867911	total: 6.7s	remaining: 7.52s
471:	learn: 3340.4440360	total: 6.72s	remaining: 7.52s
472:	learn: 3339.8393425	total: 6.73s	remaining: 7.5s
473:	learn: 3339.2993775	total: 6.74s	remaining: 7.48s
474:	learn: 3338.8972177	total: 6.75s	remaining: 7.46s
475:	learn: 3337.3891287	total: 6.76s	remaining: 7.44s
476:	learn: 3336.7396651	total: 6.77s	remaining: 7.42s
477:	learn: 3336.2139230	total: 6.78s	remaining: 7.4s
478:	learn: 3335.7164482	total: 6.79s	remaining: 7.38s
479:	learn: 3335.3387946	total: 6.79s	remaining: 7.36s
480:	learn: 3334.3867241	total: 6.8s	remaining: 7.34s
481:	learn: 3333.9095717	total: 6.81s	remaining: 7.32s
482:	learn: 3333.5196951	total: 6.82s	remaining: 7.3s
483:	learn: 3332.7500651	total: 6.83s	remaining: 7.28s
484:	learn: 3332.3381842	total: 6.84s	remaining: 7.26s
485:	learn: 3331.7659605	total: 6.85s	remaining: 7.24s
486:	learn: 3330.8712447	total: 6.86s	remaining: 7.22s
487:	learn: 3330.4925566	total: 6.87s	remaining: 7.2s
488:	learn: 3329.2291324	total: 6.88s	remaining: 7.18s
489:	learn: 3328.6098006	total: 6.88s	remaining: 7.17s
490:	learn: 3328.3964809	total: 6.89s	remaining: 7.15s
491:	learn: 3327.8248728	total: 6.91s	remaining: 7.14s
492:	learn: 3327.0842860	total: 6.93s	remaining: 7.13s
493:	learn: 3326.7191529	total: 6.94s	remaining: 7.11s
494:	learn: 3326.1819283	total: 6.95s	remaining: 7.09s
495:	learn: 3325.9172464	total: 6.96s	remaining: 7.07s
496:	learn: 3325.1082499	total: 6.97s	remaining: 7.05s
497:	learn: 3324.3252793	total: 6.98s	remaining: 7.03s
498:	learn: 3323.7856550	total: 6.99s	remaining: 7.01s
499:	learn: 3323.2038492	total: 7s	remaining: 7s
500:	learn: 3322.7837120	total: 7s	remaining: 6.98s
501:	learn: 3321.9624501	total: 7.01s	remaining: 6.96s
502:	learn: 3321.3400133	total: 7.02s	remaining: 6.94s
503:	learn: 3320.3000126	total: 7.04s	remaining: 6.93s
504:	learn: 3319.9641326	total: 7.05s	remaining: 6.92s
505:	learn: 3319.5977304	total: 7.06s	remaining: 6.9s
506:	learn: 3319.0305410	total: 7.07s	remaining: 6.88s
507:	learn: 3318.7994951	total: 7.08s	remaining: 6.86s
508:	learn: 3317.6577038	total: 7.09s	remaining: 6.84s
509:	learn: 3317.2224366	total: 7.1s	remaining: 6.82s
510:	learn: 3317.0000830	total: 7.11s	remaining: 6.8s
511:	learn: 3316.5395269	total: 7.12s	remaining: 6.79s
512:	learn: 3315.6690210	total: 7.14s	remaining: 6.78s
513:	learn: 3314.7160023	total: 7.15s	remaining: 6.76s
514:	learn: 3314.0723273	total: 7.16s	remaining: 6.74s
515:	learn: 3313.5807664	total: 7.16s	remaining: 6.72s
516:	learn: 3312.6860697	total: 7.17s	remaining: 6.7s
517:	learn: 3312.2983035	total: 7.18s	remaining: 6.68s
518:	learn: 3311.7126278	total: 7.19s	remaining: 6.67s
519:	learn: 3311.4775242	total: 7.2s	remaining: 6.65s
520:	learn: 3310.7748564	total: 7.21s	remaining: 6.63s
521:	learn: 3310.3865968	total: 7.22s	remaining: 6.61s
522:	learn: 3309.9464883	total: 7.23s	remaining: 6.59s
523:	learn: 3309.1980581	total: 7.24s	remaining: 6.58s
524:	learn: 3308.4570867	total: 7.25s	remaining: 6.56s
525:	learn: 3307.9893034	total: 7.26s	remaining: 6.54s
526:	learn: 3307.2693371	total: 7.27s	remaining: 6.52s
527:	learn: 3306.5656531	total: 7.28s	remaining: 6.5s
528:	learn: 3306.2445306	total: 7.29s	remaining: 6.49s
529:	learn: 3305.8053269	total: 7.3s	remaining: 6.47s
530:	learn: 3305.4339402	total: 7.31s	remaining: 6.45s
531:	learn: 3304.7218378	total: 7.32s	remaining: 6.43s
532:	learn: 3303.9393745	total: 7.33s	remaining: 6.42s
533:	learn: 3303.2552197	total: 7.34s	remaining: 6.41s
534:	learn: 3302.8115089	total: 7.35s	remaining: 6.39s
535:	learn: 3302.1705067	total: 7.36s	remaining: 6.37s
536:	learn: 3301.9017067	total: 7.37s	remaining: 6.35s
537:	learn: 3301.4339549	total: 7.38s	remaining: 6.33s
538:	learn: 3300.8821868	total: 7.38s	remaining: 6.32s
539:	learn: 3300.4462001	total: 7.39s	remaining: 6.3s
540:	learn: 3300.1724997	total: 7.4s	remaining: 6.28s
541:	learn: 3299.6588396	total: 7.41s	remaining: 6.26s
542:	learn: 3299.0443007	total: 7.42s	remaining: 6.24s
543:	learn: 3298.6599930	total: 7.43s	remaining: 6.23s
544:	learn: 3297.7319857	total: 7.44s	remaining: 6.21s
545:	learn: 3297.3516479	total: 7.45s	remaining: 6.19s
546:	learn: 3296.6981288	total: 7.46s	remaining: 6.18s
547:	learn: 3296.4577095	total: 7.47s	remaining: 6.16s
548:	learn: 3296.1807026	total: 7.48s	remaining: 6.15s
549:	learn: 3295.6070599	total: 7.49s	remaining: 6.13s
550:	learn: 3295.0878130	total: 7.51s	remaining: 6.12s
551:	learn: 3294.4770313	total: 7.52s	remaining: 6.1s
552:	learn: 3293.7328998	total: 7.53s	remaining: 6.08s
553:	learn: 3293.1348712	total: 7.54s	remaining: 6.07s
554:	learn: 3292.3018264	total: 7.55s	remaining: 6.05s
555:	learn: 3291.7065321	total: 7.56s	remaining: 6.04s
556:	learn: 3291.2297085	total: 7.57s	remaining: 6.02s
557:	learn: 3290.7032635	total: 7.58s	remaining: 6s
558:	learn: 3290.2481857	total: 7.59s	remaining: 5.99s
559:	learn: 3289.7432003	total: 7.6s	remaining: 5.97s
560:	learn: 3289.2722360	total: 7.61s	remaining: 5.95s
561:	learn: 3288.5352822	total: 7.62s	remaining: 5.94s
562:	learn: 3287.8361564	total: 7.63s	remaining: 5.92s
563:	learn: 3287.3171635	total: 7.63s	remaining: 5.9s
564:	learn: 3287.1161563	total: 7.64s	remaining: 5.88s
565:	learn: 3286.0193440	total: 7.65s	remaining: 5.87s
566:	learn: 3285.3309617	total: 7.66s	remaining: 5.85s
567:	learn: 3284.7246524	total: 7.67s	remaining: 5.83s
568:	learn: 3284.2499085	total: 7.68s	remaining: 5.82s
569:	learn: 3283.7873690	total: 7.69s	remaining: 5.8s
570:	learn: 3283.4681883	total: 7.7s	remaining: 5.78s
571:	learn: 3282.9667673	total: 7.71s	remaining: 5.77s
572:	learn: 3282.1437251	total: 7.72s	remaining: 5.75s
573:	learn: 3281.1672652	total: 7.73s	remaining: 5.73s
574:	learn: 3280.7187682	total: 7.75s	remaining: 5.72s
575:	learn: 3279.9231597	total: 7.76s	remaining: 5.71s
576:	learn: 3279.6013002	total: 7.76s	remaining: 5.69s
577:	learn: 3279.1260855	total: 7.77s	remaining: 5.67s
578:	learn: 3278.7884550	total: 7.78s	remaining: 5.66s
579:	learn: 3278.3339962	total: 7.79s	remaining: 5.64s
580:	learn: 3277.8550416	total: 7.8s	remaining: 5.62s
581:	learn: 3276.8697765	total: 7.81s	remaining: 5.61s
582:	learn: 3276.3275201	total: 7.82s	remaining: 5.59s
583:	learn: 3275.8602360	total: 7.83s	remaining: 5.57s
584:	learn: 3275.6350414	total: 7.83s	remaining: 5.56s
585:	learn: 3275.3883231	total: 7.84s	remaining: 5.54s
586:	learn: 3274.8986348	total: 7.85s	remaining: 5.52s
587:	learn: 3274.4847850	total: 7.86s	remaining: 5.51s
588:	learn: 3274.0230790	total: 7.87s	remaining: 5.49s
589:	learn: 3273.3701505	total: 7.88s	remaining: 5.48s
590:	learn: 3272.9862796	total: 7.89s	remaining: 5.46s
591:	learn: 3272.4374941	total: 7.9s	remaining: 5.44s
592:	learn: 3271.6990451	total: 7.91s	remaining: 5.43s
593:	learn: 3271.3432117	total: 7.92s	remaining: 5.41s
594:	learn: 3271.0447926	total: 7.93s	remaining: 5.39s
595:	learn: 3270.3371400	total: 7.93s	remaining: 5.38s
596:	learn: 3270.0702591	total: 7.94s	remaining: 5.36s
597:	learn: 3269.7340978	total: 7.96s	remaining: 5.35s
598:	learn: 3269.0306917	total: 7.97s	remaining: 5.33s
599:	learn: 3268.6500448	total: 7.98s	remaining: 5.32s
600:	learn: 3267.8855945	total: 7.99s	remaining: 5.3s
601:	learn: 3267.4748606	total: 8s	remaining: 5.29s
602:	learn: 3267.1052620	total: 8s	remaining: 5.27s
603:	learn: 3266.7201612	total: 8.01s	remaining: 5.25s
604:	learn: 3266.3659261	total: 8.02s	remaining: 5.24s
605:	learn: 3265.6290161	total: 8.03s	remaining: 5.22s
606:	learn: 3265.1447929	total: 8.04s	remaining: 5.21s
607:	learn: 3264.8096156	total: 8.05s	remaining: 5.19s
608:	learn: 3264.1230264	total: 8.06s	remaining: 5.17s
609:	learn: 3263.6524478	total: 8.07s	remaining: 5.16s
610:	learn: 3262.6595510	total: 8.08s	remaining: 5.14s
611:	learn: 3262.3493769	total: 8.09s	remaining: 5.13s
612:	learn: 3262.0599514	total: 8.1s	remaining: 5.11s
613:	learn: 3261.4099661	total: 8.1s	remaining: 5.09s
614:	learn: 3261.0858115	total: 8.11s	remaining: 5.08s
615:	learn: 3260.7810922	total: 8.12s	remaining: 5.06s
616:	learn: 3260.3521865	total: 8.13s	remaining: 5.05s
617:	learn: 3260.1143426	total: 8.14s	remaining: 5.03s
618:	learn: 3259.4793689	total: 8.16s	remaining: 5.02s
619:	learn: 3259.2001451	total: 8.17s	remaining: 5s
620:	learn: 3258.8740554	total: 8.18s	remaining: 4.99s
621:	learn: 3258.5250445	total: 8.18s	remaining: 4.97s
622:	learn: 3257.7248355	total: 8.19s	remaining: 4.96s
623:	learn: 3257.5809215	total: 8.2s	remaining: 4.94s
624:	learn: 3256.3674549	total: 8.21s	remaining: 4.92s
625:	learn: 3256.2075776	total: 8.22s	remaining: 4.91s
626:	learn: 3255.7837991	total: 8.23s	remaining: 4.89s
627:	learn: 3255.3294827	total: 8.23s	remaining: 4.88s
628:	learn: 3254.7753256	total: 8.24s	remaining: 4.86s
629:	learn: 3254.4740235	total: 8.25s	remaining: 4.85s
630:	learn: 3254.0753523	total: 8.26s	remaining: 4.83s
631:	learn: 3253.7815606	total: 8.27s	remaining: 4.81s
632:	learn: 3253.3515099	total: 8.28s	remaining: 4.8s
633:	learn: 3252.2614739	total: 8.29s	remaining: 4.78s
634:	learn: 3251.7037495	total: 8.29s	remaining: 4.77s
635:	learn: 3251.0927448	total: 8.3s	remaining: 4.75s
636:	learn: 3250.3517157	total: 8.31s	remaining: 4.74s
637:	learn: 3249.5624871	total: 8.32s	remaining: 4.72s
638:	learn: 3249.3825680	total: 8.33s	remaining: 4.71s
639:	learn: 3248.4519519	total: 8.34s	remaining: 4.69s
640:	learn: 3247.8042131	total: 8.35s	remaining: 4.67s
641:	learn: 3247.2801698	total: 8.36s	remaining: 4.66s
642:	learn: 3246.4875752	total: 8.38s	remaining: 4.65s
643:	learn: 3246.1248940	total: 8.38s	remaining: 4.63s
644:	learn: 3245.8025976	total: 8.39s	remaining: 4.62s
645:	learn: 3245.6130189	total: 8.4s	remaining: 4.6s
646:	learn: 3244.7476560	total: 8.41s	remaining: 4.59s
647:	learn: 3243.7874653	total: 8.42s	remaining: 4.57s
648:	learn: 3243.0286589	total: 8.43s	remaining: 4.56s
649:	learn: 3242.2975424	total: 8.44s	remaining: 4.54s
650:	learn: 3241.9605658	total: 8.45s	remaining: 4.53s
651:	learn: 3241.5101156	total: 8.46s	remaining: 4.51s
652:	learn: 3240.9650119	total: 8.46s	remaining: 4.5s
653:	learn: 3240.5025527	total: 8.48s	remaining: 4.48s
654:	learn: 3240.3027523	total: 8.5s	remaining: 4.47s
655:	learn: 3239.9650333	total: 8.51s	remaining: 4.46s
656:	learn: 3239.6534415	total: 8.51s	remaining: 4.44s
657:	learn: 3239.1780478	total: 8.52s	remaining: 4.43s
658:	learn: 3238.8949865	total: 8.53s	remaining: 4.41s
659:	learn: 3238.1783722	total: 8.54s	remaining: 4.4s
660:	learn: 3237.3789416	total: 8.55s	remaining: 4.38s
661:	learn: 3236.8459157	total: 8.56s	remaining: 4.37s
662:	learn: 3236.4256057	total: 8.57s	remaining: 4.36s
663:	learn: 3236.1058607	total: 8.58s	remaining: 4.34s
664:	learn: 3235.3441753	total: 8.59s	remaining: 4.33s
665:	learn: 3234.9341982	total: 8.6s	remaining: 4.31s
666:	learn: 3234.3586867	total: 8.61s	remaining: 4.3s
667:	learn: 3233.8981665	total: 8.62s	remaining: 4.28s
668:	learn: 3233.5987833	total: 8.63s	remaining: 4.27s
669:	learn: 3233.0206120	total: 8.64s	remaining: 4.25s
670:	learn: 3232.4076473	total: 8.65s	remaining: 4.24s
671:	learn: 3232.0588747	total: 8.66s	remaining: 4.22s
672:	learn: 3231.4974850	total: 8.66s	remaining: 4.21s
673:	learn: 3231.0448185	total: 8.67s	remaining: 4.2s
674:	learn: 3230.6529561	total: 8.68s	remaining: 4.18s
675:	learn: 3230.2210408	total: 8.69s	remaining: 4.17s
676:	learn: 3229.6942099	total: 8.7s	remaining: 4.15s
677:	learn: 3229.5712616	total: 8.71s	remaining: 4.14s
678:	learn: 3229.1736758	total: 8.72s	remaining: 4.12s
679:	learn: 3228.6801628	total: 8.73s	remaining: 4.11s
680:	learn: 3228.3137822	total: 8.74s	remaining: 4.09s
681:	learn: 3227.6376040	total: 8.75s	remaining: 4.08s
682:	learn: 3227.2163740	total: 8.76s	remaining: 4.06s
683:	learn: 3226.7370518	total: 8.77s	remaining: 4.05s
684:	learn: 3226.1747749	total: 8.78s	remaining: 4.04s
685:	learn: 3225.7558252	total: 8.79s	remaining: 4.02s
686:	learn: 3225.3979744	total: 8.8s	remaining: 4.01s
687:	learn: 3225.0256461	total: 8.81s	remaining: 4s
688:	learn: 3224.6321402	total: 8.82s	remaining: 3.98s
689:	learn: 3224.2012773	total: 8.83s	remaining: 3.97s
690:	learn: 3224.1118033	total: 8.84s	remaining: 3.95s
691:	learn: 3223.5764645	total: 8.85s	remaining: 3.94s
692:	learn: 3223.3667407	total: 8.86s	remaining: 3.92s
693:	learn: 3223.0433029	total: 8.87s	remaining: 3.91s
694:	learn: 3222.2398278	total: 8.88s	remaining: 3.9s
695:	learn: 3221.8143916	total: 8.88s	remaining: 3.88s
696:	learn: 3221.1967511	total: 8.89s	remaining: 3.87s
697:	learn: 3220.8691532	total: 8.9s	remaining: 3.85s
698:	learn: 3220.2376644	total: 8.91s	remaining: 3.84s
699:	learn: 3219.6803817	total: 8.92s	remaining: 3.82s
700:	learn: 3219.3339029	total: 8.93s	remaining: 3.81s
701:	learn: 3218.4231167	total: 8.94s	remaining: 3.79s
702:	learn: 3218.1951928	total: 8.95s	remaining: 3.78s
703:	learn: 3217.6033199	total: 8.96s	remaining: 3.77s
704:	learn: 3217.3508898	total: 8.97s	remaining: 3.75s
705:	learn: 3217.0614051	total: 8.98s	remaining: 3.74s
706:	learn: 3216.7065470	total: 8.99s	remaining: 3.73s
707:	learn: 3216.3652933	total: 9s	remaining: 3.71s
708:	learn: 3216.2738288	total: 9.01s	remaining: 3.7s
709:	learn: 3215.7820380	total: 9.02s	remaining: 3.68s
710:	learn: 3215.1832349	total: 9.03s	remaining: 3.67s
711:	learn: 3214.2190986	total: 9.04s	remaining: 3.65s
712:	learn: 3213.7882971	total: 9.04s	remaining: 3.64s
713:	learn: 3213.3820516	total: 9.05s	remaining: 3.63s
[... iteraciones 714–998 omitidas: la pérdida (learn) desciende gradualmente de 3213.1 a 3100.6 ...]
999:	learn: 3100.4769245	total: 11.8s	remaining: 0us
0:	learn: 11568.9461820	total: 11.6ms	remaining: 11.6s
[... iteraciones 1–371 omitidas: la pérdida (learn) desciende de 11255.4 a 3393.7 ...]
372:	learn: 3393.6848927	total: 5.38s	remaining: 9.05s
373:	learn: 3392.7518457	total: 5.41s	remaining: 9.05s
374:	learn: 3392.2569084	total: 5.41s	remaining: 9.02s
375:	learn: 3391.6387366	total: 5.42s	remaining: 9s
376:	learn: 3390.8163765	total: 5.43s	remaining: 8.98s
377:	learn: 3390.2555521	total: 5.44s	remaining: 8.95s
378:	learn: 3389.5827454	total: 5.46s	remaining: 8.94s
379:	learn: 3388.1394101	total: 5.47s	remaining: 8.92s
380:	learn: 3387.4873463	total: 5.48s	remaining: 8.9s
381:	learn: 3386.1362705	total: 5.49s	remaining: 8.87s
382:	learn: 3385.2685610	total: 5.5s	remaining: 8.85s
383:	learn: 3384.6448290	total: 5.5s	remaining: 8.83s
384:	learn: 3383.8509106	total: 5.51s	remaining: 8.8s
385:	learn: 3383.1215695	total: 5.52s	remaining: 8.78s
386:	learn: 3382.2415768	total: 5.53s	remaining: 8.76s
387:	learn: 3381.3736330	total: 5.54s	remaining: 8.74s
388:	learn: 3380.9456735	total: 5.55s	remaining: 8.71s
389:	learn: 3380.0119429	total: 5.56s	remaining: 8.69s
390:	learn: 3379.5606974	total: 5.57s	remaining: 8.67s
391:	learn: 3378.9248119	total: 5.58s	remaining: 8.65s
392:	learn: 3378.3196998	total: 5.58s	remaining: 8.62s
393:	learn: 3377.5955682	total: 5.59s	remaining: 8.6s
394:	learn: 3376.8584104	total: 5.6s	remaining: 8.58s
395:	learn: 3376.1105904	total: 5.61s	remaining: 8.56s
396:	learn: 3375.5601093	total: 5.62s	remaining: 8.54s
397:	learn: 3374.8989930	total: 5.63s	remaining: 8.51s
398:	learn: 3373.5210369	total: 5.64s	remaining: 8.49s
399:	learn: 3372.6053768	total: 5.65s	remaining: 8.47s
400:	learn: 3371.7727740	total: 5.66s	remaining: 8.46s
401:	learn: 3371.0880258	total: 5.68s	remaining: 8.45s
402:	learn: 3370.5011573	total: 5.69s	remaining: 8.43s
403:	learn: 3369.6259119	total: 5.7s	remaining: 8.4s
404:	learn: 3369.0662637	total: 5.71s	remaining: 8.38s
405:	learn: 3368.4095794	total: 5.71s	remaining: 8.36s
406:	learn: 3367.6564686	total: 5.72s	remaining: 8.34s
407:	learn: 3367.1902688	total: 5.73s	remaining: 8.32s
408:	learn: 3366.2111843	total: 5.74s	remaining: 8.3s
409:	learn: 3364.9990594	total: 5.75s	remaining: 8.28s
410:	learn: 3364.2923801	total: 5.76s	remaining: 8.26s
411:	learn: 3363.5493372	total: 5.77s	remaining: 8.24s
412:	learn: 3362.7492504	total: 5.78s	remaining: 8.22s
413:	learn: 3362.2661577	total: 5.79s	remaining: 8.19s
414:	learn: 3361.5752013	total: 5.8s	remaining: 8.17s
415:	learn: 3360.7606785	total: 5.81s	remaining: 8.15s
416:	learn: 3359.9001044	total: 5.82s	remaining: 8.13s
417:	learn: 3359.1688513	total: 5.83s	remaining: 8.11s
418:	learn: 3358.2305630	total: 5.83s	remaining: 8.09s
419:	learn: 3357.6315782	total: 5.84s	remaining: 8.07s
420:	learn: 3357.2251020	total: 5.85s	remaining: 8.05s
421:	learn: 3356.8062833	total: 5.87s	remaining: 8.04s
422:	learn: 3356.3850905	total: 5.88s	remaining: 8.02s
423:	learn: 3355.9532553	total: 5.89s	remaining: 8s
424:	learn: 3355.5381376	total: 5.9s	remaining: 7.98s
425:	learn: 3354.9628654	total: 5.91s	remaining: 7.96s
426:	learn: 3354.4032447	total: 5.91s	remaining: 7.94s
427:	learn: 3354.0600486	total: 5.92s	remaining: 7.92s
428:	learn: 3353.2961330	total: 5.93s	remaining: 7.89s
429:	learn: 3352.9897213	total: 5.94s	remaining: 7.88s
430:	learn: 3352.2310160	total: 5.95s	remaining: 7.86s
431:	learn: 3351.7327704	total: 5.96s	remaining: 7.83s
432:	learn: 3350.7493891	total: 5.97s	remaining: 7.82s
433:	learn: 3349.9105358	total: 5.98s	remaining: 7.8s
434:	learn: 3349.5356743	total: 5.99s	remaining: 7.78s
435:	learn: 3349.2890004	total: 6s	remaining: 7.76s
436:	learn: 3348.8388099	total: 6.01s	remaining: 7.74s
437:	learn: 3348.2443996	total: 6.02s	remaining: 7.72s
438:	learn: 3347.5108693	total: 6.03s	remaining: 7.7s
439:	learn: 3347.0607314	total: 6.04s	remaining: 7.68s
440:	learn: 3346.4092948	total: 6.05s	remaining: 7.67s
441:	learn: 3344.9925440	total: 6.06s	remaining: 7.64s
442:	learn: 3343.8174611	total: 6.07s	remaining: 7.64s
443:	learn: 3343.3560306	total: 6.08s	remaining: 7.62s
444:	learn: 3342.9315367	total: 6.09s	remaining: 7.6s
445:	learn: 3342.3616357	total: 6.1s	remaining: 7.58s
446:	learn: 3341.7749955	total: 6.11s	remaining: 7.56s
447:	learn: 3341.4529946	total: 6.12s	remaining: 7.54s
448:	learn: 3340.8606329	total: 6.13s	remaining: 7.52s
449:	learn: 3340.2417469	total: 6.14s	remaining: 7.5s
450:	learn: 3338.9385213	total: 6.15s	remaining: 7.48s
451:	learn: 3338.6732974	total: 6.15s	remaining: 7.46s
452:	learn: 3337.9674235	total: 6.16s	remaining: 7.44s
453:	learn: 3337.3405526	total: 6.17s	remaining: 7.42s
454:	learn: 3336.0494706	total: 6.18s	remaining: 7.4s
455:	learn: 3335.8195622	total: 6.19s	remaining: 7.38s
456:	learn: 3335.6229929	total: 6.2s	remaining: 7.36s
457:	learn: 3335.3818446	total: 6.21s	remaining: 7.34s
458:	learn: 3334.6667870	total: 6.21s	remaining: 7.32s
459:	learn: 3333.9843472	total: 6.22s	remaining: 7.3s
460:	learn: 3333.2225737	total: 6.23s	remaining: 7.29s
461:	learn: 3332.4436833	total: 6.24s	remaining: 7.27s
462:	learn: 3331.9215831	total: 6.25s	remaining: 7.25s
463:	learn: 3331.4659250	total: 6.26s	remaining: 7.24s
464:	learn: 3331.2247925	total: 6.29s	remaining: 7.24s
465:	learn: 3330.6546077	total: 6.3s	remaining: 7.22s
466:	learn: 3329.9156550	total: 6.31s	remaining: 7.21s
467:	learn: 3329.5433979	total: 6.32s	remaining: 7.18s
468:	learn: 3328.8587933	total: 6.33s	remaining: 7.17s
469:	learn: 3328.3168284	total: 6.34s	remaining: 7.15s
470:	learn: 3327.8353508	total: 6.35s	remaining: 7.13s
471:	learn: 3327.0735513	total: 6.36s	remaining: 7.11s
472:	learn: 3326.7970471	total: 6.37s	remaining: 7.09s
473:	learn: 3325.5951548	total: 6.38s	remaining: 7.09s
474:	learn: 3325.0777578	total: 6.39s	remaining: 7.07s
475:	learn: 3324.0961696	total: 6.4s	remaining: 7.05s
476:	learn: 3323.7165051	total: 6.41s	remaining: 7.03s
477:	learn: 3323.3968850	total: 6.42s	remaining: 7.01s
478:	learn: 3322.6364956	total: 6.43s	remaining: 6.99s
479:	learn: 3321.9969914	total: 6.44s	remaining: 6.97s
480:	learn: 3321.3007249	total: 6.45s	remaining: 6.96s
481:	learn: 3320.6224709	total: 6.46s	remaining: 6.94s
482:	learn: 3320.0177536	total: 6.46s	remaining: 6.92s
483:	learn: 3319.3530917	total: 6.47s	remaining: 6.9s
484:	learn: 3318.7377939	total: 6.48s	remaining: 6.88s
485:	learn: 3318.0972431	total: 6.5s	remaining: 6.87s
486:	learn: 3317.5525351	total: 6.51s	remaining: 6.86s
487:	learn: 3317.0267921	total: 6.52s	remaining: 6.84s
488:	learn: 3316.1903874	total: 6.53s	remaining: 6.83s
489:	learn: 3315.6174880	total: 6.54s	remaining: 6.81s
490:	learn: 3315.0963519	total: 6.55s	remaining: 6.79s
491:	learn: 3313.7638084	total: 6.56s	remaining: 6.78s
492:	learn: 3312.8665297	total: 6.57s	remaining: 6.76s
493:	learn: 3312.1535712	total: 6.58s	remaining: 6.74s
494:	learn: 3311.6391290	total: 6.59s	remaining: 6.72s
495:	learn: 3310.7388549	total: 6.6s	remaining: 6.7s
496:	learn: 3309.8830418	total: 6.61s	remaining: 6.69s
497:	learn: 3309.4159037	total: 6.62s	remaining: 6.67s
498:	learn: 3308.8965103	total: 6.63s	remaining: 6.65s
499:	learn: 3308.7787148	total: 6.63s	remaining: 6.63s
500:	learn: 3308.4066573	total: 6.64s	remaining: 6.62s
501:	learn: 3307.8094863	total: 6.65s	remaining: 6.6s
502:	learn: 3306.7071089	total: 6.66s	remaining: 6.58s
503:	learn: 3305.9799480	total: 6.67s	remaining: 6.57s
504:	learn: 3305.6220938	total: 6.69s	remaining: 6.55s
505:	learn: 3304.7279656	total: 6.7s	remaining: 6.54s
506:	learn: 3303.6268344	total: 6.72s	remaining: 6.53s
507:	learn: 3303.2546173	total: 6.73s	remaining: 6.51s
508:	learn: 3302.7246115	total: 6.74s	remaining: 6.5s
509:	learn: 3301.6678016	total: 6.75s	remaining: 6.48s
510:	learn: 3301.0394906	total: 6.75s	remaining: 6.46s
511:	learn: 3300.4460197	total: 6.76s	remaining: 6.45s
512:	learn: 3299.8268032	total: 6.77s	remaining: 6.43s
513:	learn: 3299.4940697	total: 6.78s	remaining: 6.41s
514:	learn: 3298.9522580	total: 6.79s	remaining: 6.39s
515:	learn: 3298.2183701	total: 6.8s	remaining: 6.38s
516:	learn: 3297.7803477	total: 6.81s	remaining: 6.36s
517:	learn: 3297.2587448	total: 6.82s	remaining: 6.34s
518:	learn: 3296.1980846	total: 6.83s	remaining: 6.33s
519:	learn: 3295.4345305	total: 6.83s	remaining: 6.31s
520:	learn: 3295.0947277	total: 6.84s	remaining: 6.29s
521:	learn: 3294.3861830	total: 6.85s	remaining: 6.27s
522:	learn: 3293.7523473	total: 6.86s	remaining: 6.26s
523:	learn: 3293.5998653	total: 6.87s	remaining: 6.24s
524:	learn: 3292.8050181	total: 6.88s	remaining: 6.22s
525:	learn: 3291.8826601	total: 6.89s	remaining: 6.21s
526:	learn: 3291.1871366	total: 6.9s	remaining: 6.19s
527:	learn: 3290.4928323	total: 6.92s	remaining: 6.18s
528:	learn: 3290.2064529	total: 6.92s	remaining: 6.16s
529:	learn: 3289.8633426	total: 6.93s	remaining: 6.15s
530:	learn: 3289.3157970	total: 6.94s	remaining: 6.13s
531:	learn: 3289.0279702	total: 6.95s	remaining: 6.12s
532:	learn: 3287.9435206	total: 6.96s	remaining: 6.1s
533:	learn: 3287.8867498	total: 6.97s	remaining: 6.08s
534:	learn: 3287.6454432	total: 6.98s	remaining: 6.06s
535:	learn: 3286.9521896	total: 6.99s	remaining: 6.05s
536:	learn: 3286.5227466	total: 6.99s	remaining: 6.03s
537:	learn: 3285.8616792	total: 7.01s	remaining: 6.02s
538:	learn: 3285.3853820	total: 7.02s	remaining: 6s
539:	learn: 3284.2703291	total: 7.03s	remaining: 5.99s
540:	learn: 3284.0054193	total: 7.03s	remaining: 5.97s
541:	learn: 3283.2652586	total: 7.04s	remaining: 5.95s
542:	learn: 3283.0513590	total: 7.05s	remaining: 5.93s
543:	learn: 3282.7135575	total: 7.06s	remaining: 5.92s
544:	learn: 3282.2399257	total: 7.07s	remaining: 5.9s
545:	learn: 3281.5610752	total: 7.08s	remaining: 5.88s
546:	learn: 3280.2957098	total: 7.09s	remaining: 5.87s
547:	learn: 3279.7172037	total: 7.1s	remaining: 5.85s
548:	learn: 3279.5686105	total: 7.11s	remaining: 5.84s
549:	learn: 3278.9067625	total: 7.12s	remaining: 5.83s
550:	learn: 3278.2523044	total: 7.13s	remaining: 5.81s
551:	learn: 3278.0626438	total: 7.14s	remaining: 5.79s
552:	learn: 3277.8257832	total: 7.15s	remaining: 5.78s
553:	learn: 3277.6973381	total: 7.16s	remaining: 5.76s
554:	learn: 3277.3026194	total: 7.17s	remaining: 5.75s
555:	learn: 3276.9486410	total: 7.18s	remaining: 5.73s
556:	learn: 3276.6356846	total: 7.19s	remaining: 5.72s
557:	learn: 3275.8332721	total: 7.2s	remaining: 5.7s
558:	learn: 3275.2987493	total: 7.2s	remaining: 5.68s
559:	learn: 3275.0085916	total: 7.21s	remaining: 5.67s
560:	learn: 3273.9694182	total: 7.22s	remaining: 5.65s
561:	learn: 3273.1930306	total: 7.23s	remaining: 5.63s
562:	learn: 3272.4464748	total: 7.24s	remaining: 5.62s
563:	learn: 3271.5349689	total: 7.25s	remaining: 5.6s
564:	learn: 3270.9404118	total: 7.26s	remaining: 5.59s
565:	learn: 3270.3510800	total: 7.27s	remaining: 5.57s
566:	learn: 3269.8198612	total: 7.28s	remaining: 5.56s
567:	learn: 3269.0779416	total: 7.29s	remaining: 5.54s
568:	learn: 3268.5354023	total: 7.29s	remaining: 5.53s
569:	learn: 3267.9619025	total: 7.31s	remaining: 5.52s
570:	learn: 3267.7343891	total: 7.32s	remaining: 5.5s
571:	learn: 3267.2881054	total: 7.33s	remaining: 5.49s
572:	learn: 3266.7974236	total: 7.34s	remaining: 5.47s
573:	learn: 3266.0515903	total: 7.35s	remaining: 5.45s
574:	learn: 3265.3471310	total: 7.36s	remaining: 5.44s
575:	learn: 3264.6625358	total: 7.38s	remaining: 5.43s
576:	learn: 3264.3309078	total: 7.38s	remaining: 5.41s
577:	learn: 3263.4776650	total: 7.39s	remaining: 5.4s
578:	learn: 3263.0927727	total: 7.4s	remaining: 5.38s
579:	learn: 3262.6834152	total: 7.41s	remaining: 5.37s
580:	learn: 3261.6806565	total: 7.42s	remaining: 5.35s
581:	learn: 3261.5796773	total: 7.43s	remaining: 5.33s
582:	learn: 3260.9946684	total: 7.44s	remaining: 5.32s
583:	learn: 3260.4987174	total: 7.45s	remaining: 5.3s
584:	learn: 3260.3339529	total: 7.45s	remaining: 5.29s
585:	learn: 3260.1021788	total: 7.46s	remaining: 5.27s
586:	learn: 3259.2502169	total: 7.47s	remaining: 5.26s
587:	learn: 3258.3865403	total: 7.48s	remaining: 5.24s
588:	learn: 3257.8967623	total: 7.49s	remaining: 5.23s
589:	learn: 3256.9652574	total: 7.5s	remaining: 5.21s
590:	learn: 3256.6913151	total: 7.51s	remaining: 5.2s
591:	learn: 3256.2543060	total: 7.53s	remaining: 5.19s
592:	learn: 3255.9371688	total: 7.54s	remaining: 5.17s
593:	learn: 3255.6230644	total: 7.55s	remaining: 5.16s
594:	learn: 3255.0574592	total: 7.57s	remaining: 5.15s
595:	learn: 3254.4404825	total: 7.57s	remaining: 5.13s
596:	learn: 3254.0264811	total: 7.58s	remaining: 5.12s
597:	learn: 3253.7221626	total: 7.59s	remaining: 5.1s
598:	learn: 3253.3113228	total: 7.6s	remaining: 5.09s
599:	learn: 3252.7325549	total: 7.61s	remaining: 5.07s
600:	learn: 3252.2357451	total: 7.62s	remaining: 5.06s
601:	learn: 3251.5754593	total: 7.63s	remaining: 5.04s
602:	learn: 3251.1955746	total: 7.64s	remaining: 5.03s
603:	learn: 3250.5857237	total: 7.64s	remaining: 5.01s
604:	learn: 3250.1478912	total: 7.66s	remaining: 5s
605:	learn: 3249.8878418	total: 7.66s	remaining: 4.98s
606:	learn: 3249.6315149	total: 7.67s	remaining: 4.97s
607:	learn: 3249.4322972	total: 7.68s	remaining: 4.95s
608:	learn: 3248.9914224	total: 7.69s	remaining: 4.94s
609:	learn: 3248.6298056	total: 7.7s	remaining: 4.92s
610:	learn: 3248.3950424	total: 7.71s	remaining: 4.91s
611:	learn: 3247.9356194	total: 7.71s	remaining: 4.89s
612:	learn: 3247.5357311	total: 7.72s	remaining: 4.88s
613:	learn: 3247.2981576	total: 7.74s	remaining: 4.87s
614:	learn: 3247.0955438	total: 7.75s	remaining: 4.85s
615:	learn: 3246.4288459	total: 7.76s	remaining: 4.83s
616:	learn: 3246.0453331	total: 7.77s	remaining: 4.82s
617:	learn: 3245.5492798	total: 7.78s	remaining: 4.81s
618:	learn: 3244.9469035	total: 7.78s	remaining: 4.79s
619:	learn: 3244.4355209	total: 7.79s	remaining: 4.78s
620:	learn: 3243.6120767	total: 7.8s	remaining: 4.76s
621:	learn: 3243.0033950	total: 7.81s	remaining: 4.75s
622:	learn: 3242.4068172	total: 7.82s	remaining: 4.73s
623:	learn: 3241.7553981	total: 7.83s	remaining: 4.72s
624:	learn: 3241.3995640	total: 7.84s	remaining: 4.7s
625:	learn: 3240.7721452	total: 7.85s	remaining: 4.69s
626:	learn: 3240.3250810	total: 7.86s	remaining: 4.67s
627:	learn: 3240.0936452	total: 7.87s	remaining: 4.66s
628:	learn: 3239.5973139	total: 7.88s	remaining: 4.64s
629:	learn: 3239.2973062	total: 7.88s	remaining: 4.63s
630:	learn: 3238.7523010	total: 7.9s	remaining: 4.62s
631:	learn: 3238.4031594	total: 7.91s	remaining: 4.61s
632:	learn: 3237.9026079	total: 7.92s	remaining: 4.59s
633:	learn: 3237.2706547	total: 7.93s	remaining: 4.58s
634:	learn: 3236.8642206	total: 7.94s	remaining: 4.56s
635:	learn: 3236.5473588	total: 7.95s	remaining: 4.55s
636:	learn: 3236.1793693	total: 7.96s	remaining: 4.54s
637:	learn: 3235.4784656	total: 7.97s	remaining: 4.52s
638:	learn: 3235.2324173	total: 7.98s	remaining: 4.51s
639:	learn: 3234.9080929	total: 7.99s	remaining: 4.49s
640:	learn: 3234.3163962	total: 8s	remaining: 4.48s
641:	learn: 3233.8844997	total: 8s	remaining: 4.46s
642:	learn: 3233.5995203	total: 8.01s	remaining: 4.45s
643:	learn: 3233.2356338	total: 8.02s	remaining: 4.43s
644:	learn: 3232.6763457	total: 8.03s	remaining: 4.42s
645:	learn: 3232.1715641	total: 8.04s	remaining: 4.41s
646:	learn: 3231.6368672	total: 8.05s	remaining: 4.39s
647:	learn: 3231.2449095	total: 8.06s	remaining: 4.38s
648:	learn: 3230.7229122	total: 8.07s	remaining: 4.36s
649:	learn: 3230.3102653	total: 8.08s	remaining: 4.35s
650:	learn: 3229.7640224	total: 8.09s	remaining: 4.33s
651:	learn: 3229.3973492	total: 8.09s	remaining: 4.32s
652:	learn: 3229.1295891	total: 8.1s	remaining: 4.31s
653:	learn: 3228.7382695	total: 8.11s	remaining: 4.29s
654:	learn: 3228.2802706	total: 8.12s	remaining: 4.28s
655:	learn: 3227.9430307	total: 8.13s	remaining: 4.26s
656:	learn: 3227.5803859	total: 8.15s	remaining: 4.25s
657:	learn: 3227.1135152	total: 8.16s	remaining: 4.24s
658:	learn: 3226.7487809	total: 8.16s	remaining: 4.22s
659:	learn: 3226.0802278	total: 8.18s	remaining: 4.21s
660:	learn: 3225.5426610	total: 8.19s	remaining: 4.2s
661:	learn: 3225.0313231	total: 8.2s	remaining: 4.18s
662:	learn: 3224.5425990	total: 8.21s	remaining: 4.17s
663:	learn: 3224.3535943	total: 8.22s	remaining: 4.16s
664:	learn: 3223.8253522	total: 8.23s	remaining: 4.14s
665:	learn: 3223.5263191	total: 8.24s	remaining: 4.13s
666:	learn: 3223.0138849	total: 8.25s	remaining: 4.12s
667:	learn: 3222.6941659	total: 8.26s	remaining: 4.1s
668:	learn: 3221.7827596	total: 8.26s	remaining: 4.09s
669:	learn: 3221.3237396	total: 8.27s	remaining: 4.07s
670:	learn: 3221.0067214	total: 8.28s	remaining: 4.06s
671:	learn: 3220.7230307	total: 8.29s	remaining: 4.05s
672:	learn: 3220.3047808	total: 8.3s	remaining: 4.03s
673:	learn: 3220.1805060	total: 8.31s	remaining: 4.02s
674:	learn: 3219.4748116	total: 8.32s	remaining: 4s
675:	learn: 3218.9114684	total: 8.33s	remaining: 3.99s
676:	learn: 3218.5865043	total: 8.34s	remaining: 3.98s
677:	learn: 3217.7981030	total: 8.36s	remaining: 3.97s
678:	learn: 3217.3484617	total: 8.38s	remaining: 3.96s
679:	learn: 3217.2136209	total: 8.39s	remaining: 3.95s
680:	learn: 3216.5978795	total: 8.4s	remaining: 3.93s
681:	learn: 3216.3603004	total: 8.4s	remaining: 3.92s
682:	learn: 3215.5296479	total: 8.41s	remaining: 3.9s
683:	learn: 3215.2054688	total: 8.42s	remaining: 3.89s
684:	learn: 3214.7302304	total: 8.43s	remaining: 3.88s
685:	learn: 3214.2705316	total: 8.44s	remaining: 3.86s
686:	learn: 3213.6018464	total: 8.45s	remaining: 3.85s
687:	learn: 3213.2075368	total: 8.47s	remaining: 3.84s
688:	learn: 3212.5472642	total: 8.47s	remaining: 3.83s
689:	learn: 3212.2854596	total: 8.48s	remaining: 3.81s
690:	learn: 3212.1438253	total: 8.49s	remaining: 3.8s
691:	learn: 3211.5892101	total: 8.5s	remaining: 3.78s
692:	learn: 3211.2411029	total: 8.51s	remaining: 3.77s
693:	learn: 3210.7545027	total: 8.52s	remaining: 3.75s
694:	learn: 3210.4066670	total: 8.53s	remaining: 3.74s
695:	learn: 3210.0781235	total: 8.54s	remaining: 3.73s
696:	learn: 3209.3322370	total: 8.54s	remaining: 3.71s
697:	learn: 3208.7323598	total: 8.55s	remaining: 3.7s
698:	learn: 3208.3467246	total: 8.57s	remaining: 3.69s
699:	learn: 3208.2235426	total: 8.58s	remaining: 3.68s
700:	learn: 3208.0846987	total: 8.59s	remaining: 3.66s
701:	learn: 3207.6799378	total: 8.6s	remaining: 3.65s
702:	learn: 3207.5758862	total: 8.6s	remaining: 3.63s
703:	learn: 3206.9749776	total: 8.61s	remaining: 3.62s
704:	learn: 3206.6357008	total: 8.62s	remaining: 3.61s
705:	learn: 3206.0019693	total: 8.63s	remaining: 3.59s
706:	learn: 3205.8834638	total: 8.64s	remaining: 3.58s
707:	learn: 3205.6951937	total: 8.65s	remaining: 3.57s
708:	learn: 3205.0718167	total: 8.66s	remaining: 3.55s
709:	learn: 3203.9434712	total: 8.67s	remaining: 3.54s
710:	learn: 3203.7399601	total: 8.68s	remaining: 3.53s
711:	learn: 3203.2856312	total: 8.68s	remaining: 3.51s
712:	learn: 3202.5586219	total: 8.69s	remaining: 3.5s
713:	learn: 3201.8471716	total: 8.7s	remaining: 3.49s
714:	learn: 3201.2983384	total: 8.71s	remaining: 3.47s
715:	learn: 3200.8287275	total: 8.72s	remaining: 3.46s
716:	learn: 3200.5841724	total: 8.73s	remaining: 3.45s
717:	learn: 3200.2427913	total: 8.74s	remaining: 3.43s
718:	learn: 3199.5669808	total: 8.75s	remaining: 3.42s
719:	learn: 3198.9892951	total: 8.76s	remaining: 3.41s
720:	learn: 3198.6896488	total: 8.78s	remaining: 3.4s
721:	learn: 3198.3706655	total: 8.79s	remaining: 3.38s
722:	learn: 3197.6942155	total: 8.8s	remaining: 3.37s
723:	learn: 3196.9994947	total: 8.81s	remaining: 3.36s
724:	learn: 3196.6066221	total: 8.82s	remaining: 3.34s
725:	learn: 3195.8472295	total: 8.83s	remaining: 3.33s
726:	learn: 3195.3960710	total: 8.84s	remaining: 3.32s
727:	learn: 3194.9142456	total: 8.84s	remaining: 3.3s
728:	learn: 3194.5470263	total: 8.85s	remaining: 3.29s
729:	learn: 3193.4505283	total: 8.86s	remaining: 3.28s
730:	learn: 3192.9974479	total: 8.87s	remaining: 3.27s
731:	learn: 3192.5020223	total: 8.88s	remaining: 3.25s
732:	learn: 3192.0355096	total: 8.89s	remaining: 3.24s
733:	learn: 3191.5248180	total: 8.9s	remaining: 3.23s
734:	learn: 3191.1253311	total: 8.91s	remaining: 3.21s
735:	learn: 3191.0435915	total: 8.92s	remaining: 3.2s
736:	learn: 3190.3732600	total: 8.93s	remaining: 3.19s
737:	learn: 3189.7591955	total: 8.94s	remaining: 3.17s
738:	learn: 3189.1731676	total: 8.95s	remaining: 3.16s
739:	learn: 3188.8608872	total: 8.95s	remaining: 3.15s
740:	learn: 3188.6165321	total: 8.96s	remaining: 3.13s
741:	learn: 3188.3718214	total: 8.97s	remaining: 3.12s
742:	learn: 3188.0007388	total: 8.99s	remaining: 3.11s
743:	learn: 3187.2611669	total: 9s	remaining: 3.1s
744:	learn: 3187.0424449	total: 9s	remaining: 3.08s
745:	learn: 3186.8077901	total: 9.01s	remaining: 3.07s
746:	learn: 3186.5310202	total: 9.02s	remaining: 3.06s
747:	learn: 3185.6916070	total: 9.03s	remaining: 3.04s
748:	learn: 3185.2913173	total: 9.04s	remaining: 3.03s
749:	learn: 3185.0254790	total: 9.05s	remaining: 3.02s
750:	learn: 3184.5238954	total: 9.06s	remaining: 3s
751:	learn: 3184.1147650	total: 9.07s	remaining: 2.99s
752:	learn: 3183.7594970	total: 9.08s	remaining: 2.98s
753:	learn: 3183.5145273	total: 9.09s	remaining: 2.96s
754:	learn: 3183.2324407	total: 9.09s	remaining: 2.95s
755:	learn: 3183.0097546	total: 9.1s	remaining: 2.94s
756:	learn: 3182.7887249	total: 9.11s	remaining: 2.92s
757:	learn: 3182.2902571	total: 9.12s	remaining: 2.91s
758:	learn: 3181.8181664	total: 9.13s	remaining: 2.9s
759:	learn: 3181.6035136	total: 9.14s	remaining: 2.89s
760:	learn: 3181.1270108	total: 9.15s	remaining: 2.88s
761:	learn: 3180.6953972	total: 9.16s	remaining: 2.86s
762:	learn: 3180.4164580	total: 9.17s	remaining: 2.85s
763:	learn: 3179.7521890	total: 9.19s	remaining: 2.84s
764:	learn: 3179.3867310	total: 9.2s	remaining: 2.83s
765:	learn: 3179.1759277	total: 9.21s	remaining: 2.81s
766:	learn: 3178.9104288	total: 9.22s	remaining: 2.8s
767:	learn: 3178.3023970	total: 9.23s	remaining: 2.79s
768:	learn: 3177.3169794	total: 9.24s	remaining: 2.77s
769:	learn: 3177.0064752	total: 9.25s	remaining: 2.76s
770:	learn: 3176.3720885	total: 9.26s	remaining: 2.75s
771:	learn: 3176.0291559	total: 9.26s	remaining: 2.74s
772:	learn: 3175.7316974	total: 9.27s	remaining: 2.72s
773:	learn: 3175.4343626	total: 9.28s	remaining: 2.71s
774:	learn: 3175.0789832	total: 9.29s	remaining: 2.7s
775:	learn: 3174.9251750	total: 9.3s	remaining: 2.68s
776:	learn: 3174.6141786	total: 9.31s	remaining: 2.67s
777:	learn: 3174.1943108	total: 9.32s	remaining: 2.66s
778:	learn: 3173.3633956	total: 9.33s	remaining: 2.65s
779:	learn: 3172.7899414	total: 9.34s	remaining: 2.63s
780:	learn: 3172.7042161	total: 9.35s	remaining: 2.62s
781:	learn: 3172.1305782	total: 9.36s	remaining: 2.61s
782:	learn: 3171.8387117	total: 9.37s	remaining: 2.6s
783:	learn: 3171.4964210	total: 9.38s	remaining: 2.58s
784:	learn: 3171.1990558	total: 9.39s	remaining: 2.57s
785:	learn: 3170.7444499	total: 9.4s	remaining: 2.56s
786:	learn: 3170.2666339	total: 9.41s	remaining: 2.55s
787:	learn: 3169.7967033	total: 9.42s	remaining: 2.53s
788:	learn: 3169.5350752	total: 9.43s	remaining: 2.52s
789:	learn: 3169.1555794	total: 9.44s	remaining: 2.51s
790:	learn: 3168.8532479	total: 9.45s	remaining: 2.5s
791:	learn: 3168.6783446	total: 9.46s	remaining: 2.48s
792:	learn: 3168.3214460	total: 9.46s	remaining: 2.47s
793:	learn: 3167.9115976	total: 9.47s	remaining: 2.46s
794:	learn: 3167.6049295	total: 9.48s	remaining: 2.44s
795:	learn: 3167.2509697	total: 9.49s	remaining: 2.43s
796:	learn: 3166.6996448	total: 9.5s	remaining: 2.42s
797:	learn: 3166.1134510	total: 9.51s	remaining: 2.41s
798:	learn: 3165.7195089	total: 9.52s	remaining: 2.4s
799:	learn: 3165.2360561	total: 9.53s	remaining: 2.38s
800:	learn: 3165.2181082	total: 9.54s	remaining: 2.37s
801:	learn: 3164.7717764	total: 9.54s	remaining: 2.36s
802:	learn: 3164.5850120	total: 9.55s	remaining: 2.34s
803:	learn: 3164.2106953	total: 9.56s	remaining: 2.33s
804:	learn: 3163.8610367	total: 9.57s	remaining: 2.32s
805:	learn: 3163.4009638	total: 9.58s	remaining: 2.31s
806:	learn: 3163.0965919	total: 9.6s	remaining: 2.3s
807:	learn: 3162.8073985	total: 9.61s	remaining: 2.28s
808:	learn: 3161.9936020	total: 9.62s	remaining: 2.27s
809:	learn: 3161.7538179	total: 9.63s	remaining: 2.26s
810:	learn: 3161.2279096	total: 9.64s	remaining: 2.25s
811:	learn: 3160.6803745	total: 9.65s	remaining: 2.23s
812:	learn: 3160.4793037	total: 9.66s	remaining: 2.22s
813:	learn: 3160.1887812	total: 9.66s	remaining: 2.21s
814:	learn: 3159.6912523	total: 9.67s	remaining: 2.19s
815:	learn: 3159.2343781	total: 9.68s	remaining: 2.18s
816:	learn: 3158.6855275	total: 9.69s	remaining: 2.17s
817:	learn: 3158.6074228	total: 9.7s	remaining: 2.16s
818:	learn: 3158.2616445	total: 9.71s	remaining: 2.15s
819:	learn: 3157.3856825	total: 9.72s	remaining: 2.13s
820:	learn: 3156.9879509	total: 9.73s	remaining: 2.12s
821:	learn: 3156.6521100	total: 9.74s	remaining: 2.11s
822:	learn: 3156.2998086	total: 9.74s	remaining: 2.1s
823:	learn: 3155.8780545	total: 9.75s	remaining: 2.08s
824:	learn: 3155.4157835	total: 9.76s	remaining: 2.07s
825:	learn: 3155.2792720	total: 9.77s	remaining: 2.06s
826:	learn: 3154.9999282	total: 9.79s	remaining: 2.05s
827:	learn: 3154.5721169	total: 9.8s	remaining: 2.04s
828:	learn: 3154.2368121	total: 9.81s	remaining: 2.02s
829:	learn: 3153.8093805	total: 9.82s	remaining: 2.01s
830:	learn: 3153.5297367	total: 9.83s	remaining: 2s
831:	learn: 3153.2240898	total: 9.84s	remaining: 1.99s
832:	learn: 3152.5567761	total: 9.85s	remaining: 1.97s
833:	learn: 3152.3826045	total: 9.85s	remaining: 1.96s
834:	learn: 3152.1372971	total: 9.86s	remaining: 1.95s
835:	learn: 3151.8631928	total: 9.87s	remaining: 1.94s
836:	learn: 3151.6076901	total: 9.88s	remaining: 1.92s
837:	learn: 3151.3529227	total: 9.89s	remaining: 1.91s
838:	learn: 3150.9598206	total: 9.9s	remaining: 1.9s
839:	learn: 3150.4426446	total: 9.91s	remaining: 1.89s
840:	learn: 3149.9337368	total: 9.91s	remaining: 1.87s
841:	learn: 3149.8010830	total: 9.93s	remaining: 1.86s
842:	learn: 3149.5728962	total: 9.94s	remaining: 1.85s
843:	learn: 3148.9044009	total: 9.95s	remaining: 1.84s
844:	learn: 3148.6608285	total: 9.96s	remaining: 1.83s
845:	learn: 3148.2797987	total: 9.96s	remaining: 1.81s
846:	learn: 3148.0184732	total: 9.97s	remaining: 1.8s
847:	learn: 3147.4392594	total: 9.99s	remaining: 1.79s
848:	learn: 3147.2863896	total: 10s	remaining: 1.78s
849:	learn: 3146.8831270	total: 10s	remaining: 1.77s
850:	learn: 3146.2787283	total: 10s	remaining: 1.75s
851:	learn: 3145.9536431	total: 10s	remaining: 1.74s
852:	learn: 3145.5894715	total: 10s	remaining: 1.73s
853:	learn: 3145.4442015	total: 10.1s	remaining: 1.72s
854:	learn: 3144.8448511	total: 10.1s	remaining: 1.71s
855:	learn: 3144.2343863	total: 10.1s	remaining: 1.69s
856:	learn: 3143.6554439	total: 10.1s	remaining: 1.68s
857:	learn: 3143.3313731	total: 10.1s	remaining: 1.67s
858:	learn: 3143.1587503	total: 10.1s	remaining: 1.66s
859:	learn: 3143.0428877	total: 10.1s	remaining: 1.64s
860:	learn: 3142.6899364	total: 10.1s	remaining: 1.63s
861:	learn: 3142.3544088	total: 10.1s	remaining: 1.62s
862:	learn: 3142.0210124	total: 10.1s	remaining: 1.61s
863:	learn: 3141.6672784	total: 10.1s	remaining: 1.59s
864:	learn: 3141.6012098	total: 10.1s	remaining: 1.58s
865:	learn: 3141.2078581	total: 10.2s	remaining: 1.57s
866:	learn: 3140.4246373	total: 10.2s	remaining: 1.56s
867:	learn: 3140.0978898	total: 10.2s	remaining: 1.55s
868:	learn: 3139.5377024	total: 10.2s	remaining: 1.53s
869:	learn: 3139.0737035	total: 10.2s	remaining: 1.52s
870:	learn: 3138.5540985	total: 10.2s	remaining: 1.51s
871:	learn: 3138.2031998	total: 10.2s	remaining: 1.5s
872:	learn: 3137.6547840	total: 10.2s	remaining: 1.49s
873:	learn: 3137.3594645	total: 10.2s	remaining: 1.48s
874:	learn: 3136.9085913	total: 10.3s	remaining: 1.47s
875:	learn: 3136.5708664	total: 10.3s	remaining: 1.45s
876:	learn: 3136.2167377	total: 10.3s	remaining: 1.44s
877:	learn: 3135.6741112	total: 10.3s	remaining: 1.43s
878:	learn: 3135.5771213	total: 10.3s	remaining: 1.42s
879:	learn: 3135.2110632	total: 10.3s	remaining: 1.41s
880:	learn: 3134.9940242	total: 10.3s	remaining: 1.4s
881:	learn: 3134.8431741	total: 10.3s	remaining: 1.38s
882:	learn: 3134.7404386	total: 10.3s	remaining: 1.37s
883:	learn: 3134.3931987	total: 10.4s	remaining: 1.36s
[CatBoost verbose training log truncated — 1000 iterations completed in ~11.5 s; learn RMSE decreased to 3095.699 at iteration 999]
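Verbose iteration logs like the one above can be condensed programmatically instead of being kept in full. A minimal sketch (assuming the log is available as plain text in CatBoost's `<iter>:\tlearn: <rmse>\ttotal: <t>\tremaining: <r>` line format; the sample lines below are illustrative):

```python
import re

# A few lines in CatBoost's verbose-log format (emitted when fit() runs with verbose=True).
log = (
    "997:\tlearn: 3096.2242293\ttotal: 11.5s\tremaining: 23ms\n"
    "998:\tlearn: 3095.9878835\ttotal: 11.5s\tremaining: 11.5ms\n"
    "999:\tlearn: 3095.6990756\ttotal: 11.5s\tremaining: 0us\n"
)

# Capture the iteration index and the learn metric from each line.
pattern = re.compile(r"^(\d+):\tlearn: ([\d.]+)\t", re.MULTILINE)
history = [(int(it), float(rmse)) for it, rmse in pattern.findall(log)]

last_iter, last_rmse = history[-1]
print(f"final iteration: {last_iter}, learn RMSE: {last_rmse}")
# → final iteration: 999, learn RMSE: 3095.6990756
```

In practice the simpler fix is to pass `verbose=100` (or `verbose=False`) to `fit()` so CatBoost prints only every 100th iteration (or nothing) rather than all 1000 lines.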
[CatBoost verbose training log for the next fit — learn RMSE falling from 11542.346 at iteration 0 to 3335.990 by iteration 544 (log truncated)]
545:	learn: 3335.6493311	total: 7.08s	remaining: 5.89s
546:	learn: 3334.9475392	total: 7.09s	remaining: 5.87s
547:	learn: 3334.3858636	total: 7.1s	remaining: 5.85s
548:	learn: 3334.0818592	total: 7.11s	remaining: 5.84s
549:	learn: 3333.2162919	total: 7.12s	remaining: 5.82s
550:	learn: 3332.7731363	total: 7.12s	remaining: 5.8s
551:	learn: 3332.0118683	total: 7.13s	remaining: 5.79s
552:	learn: 3331.1747603	total: 7.14s	remaining: 5.78s
553:	learn: 3331.1270714	total: 7.16s	remaining: 5.76s
554:	learn: 3330.7262410	total: 7.17s	remaining: 5.75s
555:	learn: 3330.0155722	total: 7.17s	remaining: 5.73s
556:	learn: 3329.3962565	total: 7.18s	remaining: 5.71s
557:	learn: 3328.1411888	total: 7.19s	remaining: 5.7s
558:	learn: 3327.3835002	total: 7.2s	remaining: 5.68s
559:	learn: 3326.6436707	total: 7.21s	remaining: 5.67s
560:	learn: 3325.3440729	total: 7.22s	remaining: 5.65s
561:	learn: 3324.3917464	total: 7.23s	remaining: 5.63s
562:	learn: 3323.6124156	total: 7.24s	remaining: 5.62s
563:	learn: 3322.6153961	total: 7.26s	remaining: 5.61s
564:	learn: 3322.2333956	total: 7.28s	remaining: 5.6s
565:	learn: 3322.1709152	total: 7.29s	remaining: 5.59s
566:	learn: 3321.3493223	total: 7.29s	remaining: 5.57s
567:	learn: 3320.5274922	total: 7.3s	remaining: 5.55s
568:	learn: 3320.0213836	total: 7.31s	remaining: 5.54s
569:	learn: 3319.4425235	total: 7.32s	remaining: 5.52s
570:	learn: 3318.6822733	total: 7.33s	remaining: 5.51s
571:	learn: 3317.8926338	total: 7.34s	remaining: 5.5s
572:	learn: 3317.3694336	total: 7.35s	remaining: 5.48s
573:	learn: 3316.4534475	total: 7.36s	remaining: 5.46s
574:	learn: 3316.4113093	total: 7.37s	remaining: 5.45s
575:	learn: 3315.8536499	total: 7.38s	remaining: 5.43s
576:	learn: 3315.3419634	total: 7.39s	remaining: 5.42s
577:	learn: 3315.2826491	total: 7.4s	remaining: 5.4s
578:	learn: 3315.2424858	total: 7.41s	remaining: 5.39s
579:	learn: 3314.9217701	total: 7.42s	remaining: 5.37s
580:	learn: 3314.5252667	total: 7.43s	remaining: 5.36s
581:	learn: 3313.9252774	total: 7.44s	remaining: 5.34s
582:	learn: 3313.8920435	total: 7.45s	remaining: 5.33s
583:	learn: 3313.4827931	total: 7.47s	remaining: 5.32s
584:	learn: 3313.4502542	total: 7.48s	remaining: 5.3s
585:	learn: 3313.4184142	total: 7.49s	remaining: 5.29s
586:	learn: 3312.0832667	total: 7.5s	remaining: 5.27s
587:	learn: 3311.1597969	total: 7.51s	remaining: 5.26s
588:	learn: 3310.6782968	total: 7.52s	remaining: 5.25s
589:	learn: 3309.9646514	total: 7.54s	remaining: 5.24s
590:	learn: 3309.4543198	total: 7.54s	remaining: 5.22s
591:	learn: 3308.9209126	total: 7.55s	remaining: 5.21s
592:	learn: 3308.4283292	total: 7.56s	remaining: 5.19s
593:	learn: 3307.9532333	total: 7.57s	remaining: 5.18s
594:	learn: 3307.2753838	total: 7.58s	remaining: 5.16s
595:	learn: 3306.3984445	total: 7.59s	remaining: 5.15s
596:	learn: 3306.0200859	total: 7.6s	remaining: 5.13s
597:	learn: 3305.3762841	total: 7.61s	remaining: 5.12s
598:	learn: 3305.0340821	total: 7.62s	remaining: 5.1s
599:	learn: 3304.6939071	total: 7.63s	remaining: 5.08s
600:	learn: 3304.0157265	total: 7.63s	remaining: 5.07s
601:	learn: 3303.0746951	total: 7.64s	remaining: 5.05s
602:	learn: 3302.1501921	total: 7.65s	remaining: 5.04s
603:	learn: 3301.9751765	total: 7.66s	remaining: 5.02s
604:	learn: 3301.7485765	total: 7.69s	remaining: 5.02s
605:	learn: 3300.6599051	total: 7.7s	remaining: 5s
606:	learn: 3299.8091025	total: 7.71s	remaining: 4.99s
607:	learn: 3299.4944722	total: 7.71s	remaining: 4.97s
608:	learn: 3298.5488321	total: 7.73s	remaining: 4.96s
609:	learn: 3297.8704359	total: 7.74s	remaining: 4.95s
610:	learn: 3296.9927852	total: 7.74s	remaining: 4.93s
611:	learn: 3296.9718375	total: 7.75s	remaining: 4.92s
612:	learn: 3296.1813288	total: 7.76s	remaining: 4.9s
613:	learn: 3295.8984763	total: 7.77s	remaining: 4.89s
614:	learn: 3294.8996476	total: 7.78s	remaining: 4.87s
615:	learn: 3294.2696890	total: 7.79s	remaining: 4.86s
616:	learn: 3293.8513503	total: 7.8s	remaining: 4.84s
617:	learn: 3293.3812052	total: 7.81s	remaining: 4.83s
618:	learn: 3292.7027619	total: 7.82s	remaining: 4.81s
619:	learn: 3291.6533117	total: 7.83s	remaining: 4.8s
620:	learn: 3290.6979978	total: 7.84s	remaining: 4.78s
621:	learn: 3290.1291743	total: 7.85s	remaining: 4.77s
622:	learn: 3289.4063803	total: 7.86s	remaining: 4.75s
623:	learn: 3288.8012522	total: 7.87s	remaining: 4.74s
624:	learn: 3288.5272472	total: 7.88s	remaining: 4.73s
625:	learn: 3288.0495066	total: 7.89s	remaining: 4.72s
626:	learn: 3287.2332961	total: 7.91s	remaining: 4.7s
627:	learn: 3287.0912735	total: 7.92s	remaining: 4.69s
628:	learn: 3286.5965439	total: 7.92s	remaining: 4.67s
629:	learn: 3286.0866595	total: 7.93s	remaining: 4.66s
630:	learn: 3284.9450961	total: 7.94s	remaining: 4.64s
631:	learn: 3284.2804467	total: 7.95s	remaining: 4.63s
632:	learn: 3283.7140821	total: 7.96s	remaining: 4.62s
633:	learn: 3283.0105513	total: 7.97s	remaining: 4.6s
634:	learn: 3282.0832185	total: 7.98s	remaining: 4.59s
635:	learn: 3281.6512758	total: 7.99s	remaining: 4.57s
636:	learn: 3281.0799329	total: 8s	remaining: 4.56s
637:	learn: 3280.3908328	total: 8.01s	remaining: 4.54s
638:	learn: 3280.1741087	total: 8.02s	remaining: 4.53s
639:	learn: 3279.2401343	total: 8.03s	remaining: 4.51s
640:	learn: 3279.0472239	total: 8.04s	remaining: 4.5s
641:	learn: 3278.6979573	total: 8.04s	remaining: 4.49s
642:	learn: 3278.1800738	total: 8.05s	remaining: 4.47s
643:	learn: 3277.5964627	total: 8.06s	remaining: 4.46s
644:	learn: 3276.8415191	total: 8.08s	remaining: 4.45s
645:	learn: 3275.8283135	total: 8.09s	remaining: 4.43s
646:	learn: 3274.7496322	total: 8.1s	remaining: 4.42s
647:	learn: 3274.1166811	total: 8.11s	remaining: 4.4s
648:	learn: 3273.1676878	total: 8.12s	remaining: 4.39s
649:	learn: 3272.6914544	total: 8.13s	remaining: 4.38s
650:	learn: 3272.1122540	total: 8.14s	remaining: 4.37s
651:	learn: 3271.5785672	total: 8.15s	remaining: 4.35s
652:	learn: 3270.9836015	total: 8.16s	remaining: 4.34s
653:	learn: 3270.4436807	total: 8.17s	remaining: 4.32s
654:	learn: 3269.8570005	total: 8.18s	remaining: 4.31s
655:	learn: 3269.3263661	total: 8.19s	remaining: 4.29s
656:	learn: 3268.7826596	total: 8.2s	remaining: 4.28s
657:	learn: 3268.2015986	total: 8.21s	remaining: 4.27s
658:	learn: 3267.8328523	total: 8.22s	remaining: 4.25s
659:	learn: 3267.3551314	total: 8.23s	remaining: 4.24s
660:	learn: 3266.9952980	total: 8.24s	remaining: 4.22s
661:	learn: 3266.7911748	total: 8.25s	remaining: 4.21s
662:	learn: 3265.9416111	total: 8.26s	remaining: 4.2s
663:	learn: 3265.5557044	total: 8.27s	remaining: 4.18s
664:	learn: 3265.1083809	total: 8.28s	remaining: 4.17s
665:	learn: 3264.6366984	total: 8.3s	remaining: 4.16s
666:	learn: 3264.0938257	total: 8.31s	remaining: 4.15s
667:	learn: 3263.6268897	total: 8.32s	remaining: 4.13s
668:	learn: 3262.9058982	total: 8.33s	remaining: 4.12s
669:	learn: 3262.4889134	total: 8.34s	remaining: 4.11s
670:	learn: 3261.9757655	total: 8.35s	remaining: 4.09s
671:	learn: 3261.6263228	total: 8.35s	remaining: 4.08s
672:	learn: 3261.3089176	total: 8.36s	remaining: 4.06s
673:	learn: 3260.8866000	total: 8.37s	remaining: 4.05s
674:	learn: 3259.7045754	total: 8.38s	remaining: 4.04s
675:	learn: 3259.6585284	total: 8.39s	remaining: 4.02s
676:	learn: 3259.2303390	total: 8.4s	remaining: 4.01s
677:	learn: 3258.5837673	total: 8.41s	remaining: 3.99s
678:	learn: 3258.5499552	total: 8.43s	remaining: 3.98s
679:	learn: 3258.4843162	total: 8.44s	remaining: 3.97s
680:	learn: 3257.6880406	total: 8.45s	remaining: 3.96s
681:	learn: 3257.0738983	total: 8.46s	remaining: 3.94s
682:	learn: 3256.4435048	total: 8.46s	remaining: 3.93s
683:	learn: 3255.7912538	total: 8.47s	remaining: 3.92s
684:	learn: 3255.0637605	total: 8.49s	remaining: 3.9s
685:	learn: 3254.3542431	total: 8.5s	remaining: 3.89s
686:	learn: 3253.7277527	total: 8.52s	remaining: 3.88s
687:	learn: 3253.3328825	total: 8.53s	remaining: 3.87s
688:	learn: 3252.9765505	total: 8.54s	remaining: 3.85s
689:	learn: 3252.4528927	total: 8.55s	remaining: 3.84s
690:	learn: 3251.4089938	total: 8.56s	remaining: 3.83s
691:	learn: 3251.1879991	total: 8.57s	remaining: 3.81s
692:	learn: 3250.4088854	total: 8.58s	remaining: 3.8s
693:	learn: 3250.1683229	total: 8.59s	remaining: 3.79s
694:	learn: 3249.6568311	total: 8.6s	remaining: 3.77s
695:	learn: 3249.2544498	total: 8.6s	remaining: 3.76s
696:	learn: 3249.0304582	total: 8.61s	remaining: 3.74s
697:	learn: 3248.8957977	total: 8.62s	remaining: 3.73s
698:	learn: 3248.2391452	total: 8.63s	remaining: 3.72s
699:	learn: 3247.8748514	total: 8.64s	remaining: 3.7s
700:	learn: 3247.8434975	total: 8.65s	remaining: 3.69s
701:	learn: 3247.5656259	total: 8.66s	remaining: 3.67s
702:	learn: 3247.0677727	total: 8.67s	remaining: 3.66s
703:	learn: 3246.6663683	total: 8.68s	remaining: 3.65s
704:	learn: 3246.1196641	total: 8.7s	remaining: 3.64s
705:	learn: 3245.7712640	total: 8.71s	remaining: 3.63s
706:	learn: 3245.3568553	total: 8.72s	remaining: 3.61s
707:	learn: 3244.9901326	total: 8.73s	remaining: 3.6s
708:	learn: 3244.6620073	total: 8.75s	remaining: 3.59s
709:	learn: 3244.1421880	total: 8.76s	remaining: 3.58s
710:	learn: 3243.5166687	total: 8.77s	remaining: 3.56s
711:	learn: 3243.0624077	total: 8.78s	remaining: 3.55s
712:	learn: 3242.3471075	total: 8.79s	remaining: 3.54s
713:	learn: 3241.9318702	total: 8.79s	remaining: 3.52s
714:	learn: 3241.5077380	total: 8.8s	remaining: 3.51s
715:	learn: 3241.0686112	total: 8.81s	remaining: 3.5s
716:	learn: 3240.7307385	total: 8.82s	remaining: 3.48s
717:	learn: 3239.9151074	total: 8.83s	remaining: 3.47s
718:	learn: 3239.8221575	total: 8.84s	remaining: 3.45s
719:	learn: 3239.4536999	total: 8.85s	remaining: 3.44s
720:	learn: 3239.1186108	total: 8.86s	remaining: 3.43s
721:	learn: 3238.2557437	total: 8.87s	remaining: 3.42s
722:	learn: 3237.7559171	total: 8.88s	remaining: 3.4s
723:	learn: 3237.3541216	total: 8.89s	remaining: 3.39s
724:	learn: 3237.2986054	total: 8.9s	remaining: 3.38s
725:	learn: 3237.2697716	total: 8.91s	remaining: 3.36s
726:	learn: 3236.8693726	total: 8.92s	remaining: 3.35s
727:	learn: 3236.3461765	total: 8.93s	remaining: 3.34s
728:	learn: 3235.7174716	total: 8.94s	remaining: 3.32s
729:	learn: 3235.3221130	total: 8.95s	remaining: 3.31s
730:	learn: 3234.6803202	total: 8.96s	remaining: 3.3s
731:	learn: 3233.8896030	total: 8.97s	remaining: 3.28s
732:	learn: 3232.8823690	total: 8.98s	remaining: 3.27s
733:	learn: 3232.1299224	total: 8.99s	remaining: 3.26s
734:	learn: 3231.7817365	total: 8.99s	remaining: 3.24s
735:	learn: 3231.1957814	total: 9s	remaining: 3.23s
736:	learn: 3230.4079462	total: 9.01s	remaining: 3.22s
737:	learn: 3230.0623992	total: 9.03s	remaining: 3.2s
738:	learn: 3229.5333044	total: 9.04s	remaining: 3.19s
739:	learn: 3229.0884751	total: 9.04s	remaining: 3.18s
740:	learn: 3228.5188207	total: 9.05s	remaining: 3.16s
741:	learn: 3228.1610722	total: 9.06s	remaining: 3.15s
742:	learn: 3227.6882009	total: 9.07s	remaining: 3.14s
743:	learn: 3226.9397717	total: 9.08s	remaining: 3.13s
744:	learn: 3226.1830004	total: 9.09s	remaining: 3.11s
745:	learn: 3225.9494811	total: 9.1s	remaining: 3.1s
746:	learn: 3224.7895098	total: 9.12s	remaining: 3.09s
747:	learn: 3224.3611954	total: 9.13s	remaining: 3.07s
748:	learn: 3223.9177008	total: 9.13s	remaining: 3.06s
749:	learn: 3223.6549388	total: 9.14s	remaining: 3.05s
750:	learn: 3223.6310898	total: 9.15s	remaining: 3.03s
751:	learn: 3222.7842893	total: 9.16s	remaining: 3.02s
752:	learn: 3222.4663187	total: 9.17s	remaining: 3.01s
753:	learn: 3222.0573950	total: 9.18s	remaining: 3s
754:	learn: 3221.6478252	total: 9.19s	remaining: 2.98s
755:	learn: 3220.8748039	total: 9.2s	remaining: 2.97s
756:	learn: 3220.5208633	total: 9.21s	remaining: 2.96s
757:	learn: 3219.8974522	total: 9.21s	remaining: 2.94s
758:	learn: 3219.0255477	total: 9.22s	remaining: 2.93s
759:	learn: 3218.5580790	total: 9.23s	remaining: 2.92s
760:	learn: 3218.1471673	total: 9.24s	remaining: 2.9s
761:	learn: 3217.8853023	total: 9.25s	remaining: 2.89s
762:	learn: 3217.5646327	total: 9.26s	remaining: 2.88s
763:	learn: 3217.1277842	total: 9.27s	remaining: 2.86s
764:	learn: 3216.8771978	total: 9.28s	remaining: 2.85s
765:	learn: 3216.7094501	total: 9.29s	remaining: 2.84s
766:	learn: 3216.3252280	total: 9.29s	remaining: 2.82s
767:	learn: 3216.0024955	total: 9.3s	remaining: 2.81s
768:	learn: 3215.4635538	total: 9.32s	remaining: 2.8s
769:	learn: 3215.4298810	total: 9.33s	remaining: 2.79s
770:	learn: 3214.9309390	total: 9.34s	remaining: 2.77s
771:	learn: 3214.4355207	total: 9.36s	remaining: 2.76s
772:	learn: 3213.6777657	total: 9.36s	remaining: 2.75s
773:	learn: 3213.3974987	total: 9.37s	remaining: 2.74s
774:	learn: 3212.8320990	total: 9.38s	remaining: 2.72s
775:	learn: 3211.9522775	total: 9.39s	remaining: 2.71s
776:	learn: 3211.5838991	total: 9.4s	remaining: 2.7s
777:	learn: 3210.9682489	total: 9.41s	remaining: 2.68s
778:	learn: 3210.6183215	total: 9.42s	remaining: 2.67s
779:	learn: 3209.8946210	total: 9.43s	remaining: 2.66s
780:	learn: 3209.3388249	total: 9.44s	remaining: 2.65s
781:	learn: 3208.9351341	total: 9.44s	remaining: 2.63s
782:	learn: 3208.1897881	total: 9.45s	remaining: 2.62s
783:	learn: 3207.9343229	total: 9.46s	remaining: 2.61s
784:	learn: 3207.8983104	total: 9.47s	remaining: 2.59s
785:	learn: 3207.3006564	total: 9.48s	remaining: 2.58s
786:	learn: 3206.6888061	total: 9.49s	remaining: 2.57s
787:	learn: 3206.3391266	total: 9.52s	remaining: 2.56s
788:	learn: 3205.9559994	total: 9.53s	remaining: 2.55s
789:	learn: 3205.5226953	total: 9.54s	remaining: 2.54s
790:	learn: 3205.4324575	total: 9.55s	remaining: 2.52s
791:	learn: 3204.5806865	total: 9.56s	remaining: 2.51s
792:	learn: 3203.8676876	total: 9.57s	remaining: 2.5s
793:	learn: 3203.3211353	total: 9.58s	remaining: 2.48s
794:	learn: 3202.9260059	total: 9.59s	remaining: 2.47s
795:	learn: 3202.3577802	total: 9.6s	remaining: 2.46s
796:	learn: 3202.3069762	total: 9.61s	remaining: 2.45s
797:	learn: 3202.1491314	total: 9.61s	remaining: 2.43s
798:	learn: 3201.5667814	total: 9.62s	remaining: 2.42s
799:	learn: 3201.2978494	total: 9.63s	remaining: 2.41s
800:	learn: 3200.8976068	total: 9.64s	remaining: 2.4s
801:	learn: 3200.4675103	total: 9.65s	remaining: 2.38s
802:	learn: 3200.0127187	total: 9.66s	remaining: 2.37s
803:	learn: 3199.7091049	total: 9.67s	remaining: 2.36s
804:	learn: 3199.2558811	total: 9.68s	remaining: 2.34s
805:	learn: 3198.9042718	total: 9.69s	remaining: 2.33s
806:	learn: 3198.5465301	total: 9.7s	remaining: 2.32s
807:	learn: 3198.0513205	total: 9.71s	remaining: 2.31s
808:	learn: 3197.8181673	total: 9.72s	remaining: 2.29s
809:	learn: 3197.3813386	total: 9.73s	remaining: 2.28s
810:	learn: 3196.8376587	total: 9.74s	remaining: 2.27s
811:	learn: 3196.2483254	total: 9.75s	remaining: 2.26s
812:	learn: 3196.2301819	total: 9.76s	remaining: 2.25s
813:	learn: 3195.7152070	total: 9.77s	remaining: 2.23s
814:	learn: 3195.2390624	total: 9.78s	remaining: 2.22s
815:	learn: 3194.3697611	total: 9.79s	remaining: 2.21s
816:	learn: 3194.1422394	total: 9.8s	remaining: 2.19s
817:	learn: 3193.8629240	total: 9.81s	remaining: 2.18s
818:	learn: 3193.3940154	total: 9.82s	remaining: 2.17s
819:	learn: 3192.6450682	total: 9.83s	remaining: 2.16s
820:	learn: 3192.3243718	total: 9.84s	remaining: 2.14s
821:	learn: 3191.9956288	total: 9.84s	remaining: 2.13s
822:	learn: 3191.4748061	total: 9.85s	remaining: 2.12s
823:	learn: 3190.9867887	total: 9.87s	remaining: 2.11s
824:	learn: 3190.1768545	total: 9.87s	remaining: 2.09s
825:	learn: 3189.8861594	total: 9.88s	remaining: 2.08s
826:	learn: 3189.3816745	total: 9.89s	remaining: 2.07s
827:	learn: 3188.9074237	total: 9.9s	remaining: 2.06s
828:	learn: 3188.6374315	total: 9.91s	remaining: 2.04s
829:	learn: 3188.1884834	total: 9.92s	remaining: 2.03s
830:	learn: 3187.1676682	total: 9.94s	remaining: 2.02s
831:	learn: 3186.8670503	total: 9.95s	remaining: 2.01s
832:	learn: 3186.5608978	total: 9.96s	remaining: 2s
833:	learn: 3185.9330440	total: 9.97s	remaining: 1.98s
834:	learn: 3185.3764973	total: 9.98s	remaining: 1.97s
835:	learn: 3184.9908037	total: 9.99s	remaining: 1.96s
836:	learn: 3184.6643204	total: 10s	remaining: 1.95s
837:	learn: 3184.5917993	total: 10s	remaining: 1.93s
838:	learn: 3184.1838181	total: 10s	remaining: 1.92s
839:	learn: 3183.8607163	total: 10s	remaining: 1.91s
840:	learn: 3183.5602332	total: 10s	remaining: 1.9s
841:	learn: 3183.3678272	total: 10s	remaining: 1.88s
842:	learn: 3183.0660904	total: 10.1s	remaining: 1.87s
843:	learn: 3182.6506648	total: 10.1s	remaining: 1.86s
844:	learn: 3182.0700770	total: 10.1s	remaining: 1.85s
845:	learn: 3181.4836888	total: 10.1s	remaining: 1.83s
846:	learn: 3181.1862497	total: 10.1s	remaining: 1.82s
847:	learn: 3180.4253880	total: 10.1s	remaining: 1.81s
848:	learn: 3180.1651267	total: 10.1s	remaining: 1.8s
849:	learn: 3180.0439547	total: 10.1s	remaining: 1.78s
850:	learn: 3179.6938276	total: 10.1s	remaining: 1.77s
851:	learn: 3179.4091098	total: 10.1s	remaining: 1.76s
852:	learn: 3179.0296444	total: 10.1s	remaining: 1.75s
853:	learn: 3178.5631143	total: 10.2s	remaining: 1.74s
854:	learn: 3177.9512938	total: 10.2s	remaining: 1.72s
855:	learn: 3177.4343474	total: 10.2s	remaining: 1.71s
856:	learn: 3177.1483147	total: 10.2s	remaining: 1.7s
857:	learn: 3176.6782448	total: 10.2s	remaining: 1.69s
858:	learn: 3176.1107080	total: 10.2s	remaining: 1.68s
859:	learn: 3175.3910392	total: 10.2s	remaining: 1.66s
860:	learn: 3175.2613107	total: 10.2s	remaining: 1.65s
861:	learn: 3174.3841770	total: 10.2s	remaining: 1.64s
862:	learn: 3173.9633315	total: 10.2s	remaining: 1.63s
863:	learn: 3173.7267164	total: 10.3s	remaining: 1.61s
864:	learn: 3173.1558015	total: 10.3s	remaining: 1.6s
865:	learn: 3172.8698301	total: 10.3s	remaining: 1.59s
866:	learn: 3172.7050735	total: 10.3s	remaining: 1.58s
867:	learn: 3172.2724009	total: 10.3s	remaining: 1.56s
868:	learn: 3171.3038536	total: 10.3s	remaining: 1.55s
869:	learn: 3170.5452988	total: 10.3s	remaining: 1.54s
870:	learn: 3170.0969115	total: 10.3s	remaining: 1.53s
871:	learn: 3169.6784046	total: 10.3s	remaining: 1.52s
872:	learn: 3169.1213517	total: 10.3s	remaining: 1.5s
873:	learn: 3168.7358260	total: 10.4s	remaining: 1.49s
874:	learn: 3168.1067945	total: 10.4s	remaining: 1.48s
875:	learn: 3167.9292063	total: 10.4s	remaining: 1.47s
876:	learn: 3167.5120624	total: 10.4s	remaining: 1.46s
877:	learn: 3167.2688449	total: 10.4s	remaining: 1.44s
878:	learn: 3166.9243469	total: 10.4s	remaining: 1.43s
879:	learn: 3166.7435567	total: 10.4s	remaining: 1.42s
880:	learn: 3166.3799002	total: 10.4s	remaining: 1.41s
881:	learn: 3165.9885660	total: 10.4s	remaining: 1.4s
882:	learn: 3165.8159404	total: 10.4s	remaining: 1.38s
883:	learn: 3165.3706300	total: 10.4s	remaining: 1.37s
884:	learn: 3164.9005624	total: 10.5s	remaining: 1.36s
885:	learn: 3164.5836803	total: 10.5s	remaining: 1.35s
886:	learn: 3164.3123691	total: 10.5s	remaining: 1.33s
887:	learn: 3163.7925489	total: 10.5s	remaining: 1.32s
888:	learn: 3163.1673386	total: 10.5s	remaining: 1.31s
889:	learn: 3162.9170948	total: 10.5s	remaining: 1.3s
890:	learn: 3162.6654543	total: 10.5s	remaining: 1.29s
891:	learn: 3162.4581750	total: 10.5s	remaining: 1.27s
892:	learn: 3162.1873723	total: 10.6s	remaining: 1.26s
893:	learn: 3161.7628818	total: 10.6s	remaining: 1.25s
894:	learn: 3161.2249027	total: 10.6s	remaining: 1.24s
895:	learn: 3160.8240701	total: 10.6s	remaining: 1.23s
896:	learn: 3160.7097549	total: 10.6s	remaining: 1.22s
897:	learn: 3160.1709618	total: 10.6s	remaining: 1.2s
898:	learn: 3160.0071273	total: 10.6s	remaining: 1.19s
899:	learn: 3159.5137264	total: 10.6s	remaining: 1.18s
900:	learn: 3159.3978071	total: 10.6s	remaining: 1.17s
901:	learn: 3158.8813298	total: 10.6s	remaining: 1.16s
902:	learn: 3158.8246588	total: 10.6s	remaining: 1.14s
903:	learn: 3158.4405950	total: 10.7s	remaining: 1.13s
904:	learn: 3158.1014729	total: 10.7s	remaining: 1.12s
905:	learn: 3157.6026744	total: 10.7s	remaining: 1.11s
906:	learn: 3157.2680062	total: 10.7s	remaining: 1.09s
907:	learn: 3156.8665775	total: 10.7s	remaining: 1.08s
908:	learn: 3156.5769748	total: 10.7s	remaining: 1.07s
909:	learn: 3156.1434674	total: 10.7s	remaining: 1.06s
910:	learn: 3155.7043947	total: 10.7s	remaining: 1.05s
911:	learn: 3155.1941540	total: 10.7s	remaining: 1.03s
912:	learn: 3155.0588842	total: 10.7s	remaining: 1.02s
913:	learn: 3154.6873683	total: 10.7s	remaining: 1.01s
914:	learn: 3154.2371655	total: 10.8s	remaining: 1s
915:	learn: 3153.5569165	total: 10.8s	remaining: 988ms
916:	learn: 3153.0856102	total: 10.8s	remaining: 976ms
917:	learn: 3152.6288189	total: 10.8s	remaining: 964ms
918:	learn: 3152.3351781	total: 10.8s	remaining: 952ms
919:	learn: 3152.0617771	total: 10.8s	remaining: 940ms
920:	learn: 3151.6887240	total: 10.8s	remaining: 929ms
921:	learn: 3151.2297559	total: 10.8s	remaining: 917ms
922:	learn: 3150.8150953	total: 10.8s	remaining: 905ms
923:	learn: 3150.2105126	total: 10.9s	remaining: 893ms
924:	learn: 3150.0438551	total: 10.9s	remaining: 881ms
925:	learn: 3149.6714624	total: 10.9s	remaining: 869ms
926:	learn: 3149.3904553	total: 10.9s	remaining: 857ms
927:	learn: 3148.9747116	total: 10.9s	remaining: 845ms
928:	learn: 3148.2566793	total: 10.9s	remaining: 833ms
929:	learn: 3147.8713756	total: 10.9s	remaining: 821ms
930:	learn: 3147.5648346	total: 10.9s	remaining: 809ms
931:	learn: 3146.7894758	total: 10.9s	remaining: 797ms
932:	learn: 3146.4369829	total: 10.9s	remaining: 785ms
933:	learn: 3145.9438216	total: 10.9s	remaining: 773ms
934:	learn: 3145.8080291	total: 11s	remaining: 761ms
935:	learn: 3145.6008036	total: 11s	remaining: 749ms
936:	learn: 3145.0900366	total: 11s	remaining: 738ms
937:	learn: 3144.9664716	total: 11s	remaining: 726ms
938:	learn: 3144.4194660	total: 11s	remaining: 714ms
939:	learn: 3144.0105533	total: 11s	remaining: 702ms
940:	learn: 3143.6277282	total: 11s	remaining: 691ms
941:	learn: 3143.0127343	total: 11s	remaining: 679ms
942:	learn: 3142.5975456	total: 11s	remaining: 667ms
943:	learn: 3142.1415251	total: 11s	remaining: 655ms
944:	learn: 3141.7635327	total: 11s	remaining: 643ms
945:	learn: 3141.0600785	total: 11.1s	remaining: 631ms
946:	learn: 3140.8055755	total: 11.1s	remaining: 619ms
947:	learn: 3140.5270168	total: 11.1s	remaining: 608ms
948:	learn: 3139.9298378	total: 11.1s	remaining: 596ms
949:	learn: 3139.7364963	total: 11.1s	remaining: 584ms
950:	learn: 3139.3017733	total: 11.1s	remaining: 572ms
951:	learn: 3138.7890309	total: 11.1s	remaining: 560ms
952:	learn: 3138.5173267	total: 11.1s	remaining: 549ms
953:	learn: 3138.2144184	total: 11.1s	remaining: 537ms
954:	learn: 3137.7126653	total: 11.1s	remaining: 525ms
955:	learn: 3137.3065642	total: 11.1s	remaining: 513ms
956:	learn: 3136.7501341	total: 11.2s	remaining: 501ms
957:	learn: 3136.7210212	total: 11.2s	remaining: 490ms
958:	learn: 3136.1526649	total: 11.2s	remaining: 478ms
959:	learn: 3135.7189356	total: 11.2s	remaining: 467ms
960:	learn: 3135.2232749	total: 11.2s	remaining: 455ms
961:	learn: 3135.0749059	total: 11.2s	remaining: 443ms
962:	learn: 3135.0522729	total: 11.2s	remaining: 431ms
963:	learn: 3134.5574283	total: 11.2s	remaining: 420ms
964:	learn: 3134.2558271	total: 11.2s	remaining: 408ms
965:	learn: 3133.7939611	total: 11.3s	remaining: 396ms
966:	learn: 3133.3735979	total: 11.3s	remaining: 384ms
967:	learn: 3132.8296637	total: 11.3s	remaining: 373ms
968:	learn: 3132.4186301	total: 11.3s	remaining: 361ms
969:	learn: 3131.7616958	total: 11.3s	remaining: 349ms
970:	learn: 3131.4357989	total: 11.3s	remaining: 338ms
971:	learn: 3130.9264913	total: 11.3s	remaining: 326ms
972:	learn: 3130.6593291	total: 11.3s	remaining: 314ms
973:	learn: 3130.2163003	total: 11.3s	remaining: 303ms
974:	learn: 3130.0317436	total: 11.3s	remaining: 291ms
975:	learn: 3129.5367673	total: 11.4s	remaining: 279ms
976:	learn: 3128.7752747	total: 11.4s	remaining: 267ms
977:	learn: 3128.3595870	total: 11.4s	remaining: 256ms
978:	learn: 3127.9561680	total: 11.4s	remaining: 244ms
979:	learn: 3127.6536986	total: 11.4s	remaining: 233ms
980:	learn: 3127.2039700	total: 11.4s	remaining: 221ms
981:	learn: 3126.6044100	total: 11.4s	remaining: 209ms
982:	learn: 3125.9298989	total: 11.4s	remaining: 198ms
983:	learn: 3125.7338147	total: 11.4s	remaining: 186ms
984:	learn: 3125.5397796	total: 11.5s	remaining: 174ms
985:	learn: 3125.3907801	total: 11.5s	remaining: 163ms
986:	learn: 3125.1477227	total: 11.5s	remaining: 151ms
987:	learn: 3124.6743946	total: 11.5s	remaining: 140ms
988:	learn: 3124.2992509	total: 11.5s	remaining: 128ms
989:	learn: 3124.1390318	total: 11.5s	remaining: 116ms
990:	learn: 3123.5228106	total: 11.5s	remaining: 105ms
991:	learn: 3123.3631224	total: 11.5s	remaining: 92.9ms
992:	learn: 3122.8623637	total: 11.5s	remaining: 81.3ms
993:	learn: 3122.2737997	total: 11.5s	remaining: 69.7ms
994:	learn: 3121.6496417	total: 11.6s	remaining: 58.1ms
995:	learn: 3120.7847488	total: 11.6s	remaining: 46.4ms
996:	learn: 3120.4470459	total: 11.6s	remaining: 34.8ms
997:	learn: 3120.1926429	total: 11.6s	remaining: 23.2ms
998:	learn: 3119.7939239	total: 11.6s	remaining: 11.6ms
999:	learn: 3119.1023176	total: 11.6s	remaining: 0us
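Each verbose line above follows the pattern `iter:	learn: <rmse>	total: <time>	remaining: <time>`. As a minimal stdlib sketch (the helper name and regex are our own, not part of the notebook), the training curve can be recovered from these lines for plotting or inspection:

```python
import re

# A CatBoost verbose line looks like:
#   "471:\tlearn: 3379.1015614\ttotal: 6.31s\tremaining: 7.06s"
# Capture the iteration index and the training metric (RMSE here).
LOG_RE = re.compile(r"^(\d+):\s+learn:\s+([\d.]+)")

def parse_catboost_log(lines):
    """Return a list of (iteration, learn_metric) tuples, skipping non-log lines."""
    curve = []
    for line in lines:
        m = LOG_RE.match(line.strip())
        if m:
            curve.append((int(m.group(1)), float(m.group(2))))
    return curve

sample = [
    "471:\tlearn: 3379.1015614\ttotal: 6.31s\tremaining: 7.06s",
    "999:\tlearn: 3119.1023176\ttotal: 11.6s\tremaining: 0us",
]
print(parse_catboost_log(sample))
# [(471, 3379.1015614), (999, 3119.1023176)]
```

In practice, passing `verbose=100` (or `verbose=False`) to the model's `fit` call keeps the notebook output manageable instead of printing every iteration.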
0:	learn: 11685.3389480	total: 13.6ms	remaining: 13.6s
[CatBoost verbose training output truncated: iterations 1–131 omitted]
132:	learn: 3703.3999762	total: 1.69s	remaining: 11s
133:	learn: 3700.8006376	total: 1.7s	remaining: 11s
134:	learn: 3698.1448165	total: 1.71s	remaining: 11s
135:	learn: 3695.1571086	total: 1.73s	remaining: 11s
136:	learn: 3692.6271117	total: 1.74s	remaining: 10.9s
137:	learn: 3689.9883911	total: 1.75s	remaining: 10.9s
138:	learn: 3687.2078720	total: 1.76s	remaining: 10.9s
139:	learn: 3684.7609500	total: 1.77s	remaining: 10.9s
140:	learn: 3682.0871505	total: 1.78s	remaining: 10.9s
141:	learn: 3679.1511502	total: 1.8s	remaining: 10.9s
142:	learn: 3677.5388482	total: 1.81s	remaining: 10.8s
143:	learn: 3676.2557236	total: 1.82s	remaining: 10.8s
144:	learn: 3674.0319552	total: 1.83s	remaining: 10.8s
145:	learn: 3671.6761165	total: 1.84s	remaining: 10.8s
146:	learn: 3669.6971239	total: 1.85s	remaining: 10.8s
147:	learn: 3667.1764847	total: 1.86s	remaining: 10.7s
148:	learn: 3664.8349565	total: 1.88s	remaining: 10.7s
149:	learn: 3662.5828818	total: 1.89s	remaining: 10.7s
150:	learn: 3660.4510406	total: 1.91s	remaining: 10.7s
151:	learn: 3659.0155711	total: 1.92s	remaining: 10.7s
152:	learn: 3657.4520123	total: 1.93s	remaining: 10.7s
153:	learn: 3655.5913713	total: 1.94s	remaining: 10.7s
154:	learn: 3653.6648271	total: 1.96s	remaining: 10.7s
155:	learn: 3651.2228963	total: 1.97s	remaining: 10.6s
156:	learn: 3649.6544017	total: 1.98s	remaining: 10.6s
157:	learn: 3647.5447023	total: 1.99s	remaining: 10.6s
158:	learn: 3646.7697883	total: 2s	remaining: 10.6s
159:	learn: 3645.1271396	total: 2.01s	remaining: 10.5s
160:	learn: 3643.6848364	total: 2.02s	remaining: 10.5s
161:	learn: 3642.0188299	total: 2.03s	remaining: 10.5s
162:	learn: 3639.7366868	total: 2.04s	remaining: 10.5s
163:	learn: 3638.0038726	total: 2.05s	remaining: 10.4s
164:	learn: 3635.8137733	total: 2.06s	remaining: 10.4s
165:	learn: 3634.7892834	total: 2.07s	remaining: 10.4s
166:	learn: 3632.8753668	total: 2.08s	remaining: 10.4s
167:	learn: 3630.6841735	total: 2.1s	remaining: 10.4s
168:	learn: 3629.1542301	total: 2.11s	remaining: 10.4s
169:	learn: 3627.7776899	total: 2.12s	remaining: 10.3s
170:	learn: 3625.6948979	total: 2.13s	remaining: 10.3s
171:	learn: 3624.7629068	total: 2.13s	remaining: 10.3s
172:	learn: 3623.1806802	total: 2.15s	remaining: 10.3s
173:	learn: 3621.3372471	total: 2.15s	remaining: 10.2s
174:	learn: 3619.5611561	total: 2.16s	remaining: 10.2s
175:	learn: 3617.5266872	total: 2.17s	remaining: 10.2s
176:	learn: 3615.8746306	total: 2.18s	remaining: 10.1s
177:	learn: 3613.8336507	total: 2.19s	remaining: 10.1s
178:	learn: 3612.4899835	total: 2.2s	remaining: 10.1s
179:	learn: 3611.2308414	total: 2.21s	remaining: 10.1s
180:	learn: 3609.8740444	total: 2.22s	remaining: 10s
181:	learn: 3608.8378738	total: 2.23s	remaining: 10s
182:	learn: 3607.1281171	total: 2.23s	remaining: 9.97s
183:	learn: 3605.6518636	total: 2.24s	remaining: 9.96s
184:	learn: 3604.7544535	total: 2.26s	remaining: 9.94s
185:	learn: 3602.7322112	total: 2.27s	remaining: 9.92s
186:	learn: 3601.3284532	total: 2.27s	remaining: 9.89s
187:	learn: 3600.2747922	total: 2.28s	remaining: 9.86s
188:	learn: 3599.0594381	total: 2.3s	remaining: 9.87s
189:	learn: 3597.1689910	total: 2.31s	remaining: 9.85s
190:	learn: 3595.8163617	total: 2.32s	remaining: 9.82s
191:	learn: 3594.8021964	total: 2.33s	remaining: 9.8s
192:	learn: 3593.6903023	total: 2.34s	remaining: 9.77s
193:	learn: 3592.3037396	total: 2.35s	remaining: 9.75s
194:	learn: 3590.9019465	total: 2.35s	remaining: 9.72s
195:	learn: 3589.8501883	total: 2.36s	remaining: 9.7s
196:	learn: 3588.5682224	total: 2.37s	remaining: 9.67s
197:	learn: 3587.2776642	total: 2.38s	remaining: 9.65s
198:	learn: 3586.6557032	total: 2.39s	remaining: 9.62s
199:	learn: 3585.5320015	total: 2.4s	remaining: 9.59s
200:	learn: 3584.3313326	total: 2.41s	remaining: 9.57s
201:	learn: 3583.4367850	total: 2.42s	remaining: 9.54s
202:	learn: 3582.2049645	total: 2.42s	remaining: 9.52s
203:	learn: 3581.2734176	total: 2.43s	remaining: 9.49s
204:	learn: 3580.4123997	total: 2.44s	remaining: 9.47s
205:	learn: 3579.3030532	total: 2.45s	remaining: 9.44s
206:	learn: 3578.0502688	total: 2.46s	remaining: 9.42s
207:	learn: 3577.1017768	total: 2.47s	remaining: 9.4s
208:	learn: 3575.7187330	total: 2.48s	remaining: 9.38s
209:	learn: 3574.8495598	total: 2.49s	remaining: 9.35s
210:	learn: 3573.4529280	total: 2.49s	remaining: 9.33s
211:	learn: 3571.6985413	total: 2.51s	remaining: 9.34s
212:	learn: 3570.6675962	total: 2.52s	remaining: 9.33s
213:	learn: 3568.9873589	total: 2.54s	remaining: 9.32s
214:	learn: 3567.6860362	total: 2.55s	remaining: 9.3s
215:	learn: 3566.0591413	total: 2.56s	remaining: 9.28s
216:	learn: 3563.9119471	total: 2.56s	remaining: 9.25s
217:	learn: 3562.4544306	total: 2.57s	remaining: 9.23s
218:	learn: 3561.6030866	total: 2.58s	remaining: 9.21s
219:	learn: 3560.4997395	total: 2.59s	remaining: 9.19s
220:	learn: 3559.1171462	total: 2.6s	remaining: 9.17s
221:	learn: 3558.3609143	total: 2.61s	remaining: 9.15s
222:	learn: 3557.0102012	total: 2.62s	remaining: 9.13s
223:	learn: 3555.6592960	total: 2.63s	remaining: 9.11s
224:	learn: 3554.6287891	total: 2.65s	remaining: 9.12s
225:	learn: 3552.9348127	total: 2.66s	remaining: 9.1s
226:	learn: 3552.1609082	total: 2.67s	remaining: 9.08s
227:	learn: 3550.9516113	total: 2.67s	remaining: 9.06s
228:	learn: 3550.0539000	total: 2.68s	remaining: 9.04s
229:	learn: 3549.0530182	total: 2.69s	remaining: 9.02s
230:	learn: 3548.0734491	total: 2.7s	remaining: 8.99s
231:	learn: 3547.3999114	total: 2.71s	remaining: 8.99s
232:	learn: 3546.2005371	total: 2.73s	remaining: 9s
233:	learn: 3544.9521304	total: 2.74s	remaining: 8.98s
234:	learn: 3544.0167455	total: 2.75s	remaining: 8.96s
235:	learn: 3543.1432048	total: 2.76s	remaining: 8.94s
236:	learn: 3542.3161063	total: 2.77s	remaining: 8.92s
237:	learn: 3541.1506239	total: 2.78s	remaining: 8.9s
238:	learn: 3539.8014216	total: 2.79s	remaining: 8.88s
239:	learn: 3538.4036343	total: 2.8s	remaining: 8.86s
240:	learn: 3537.8464932	total: 2.81s	remaining: 8.83s
241:	learn: 3536.4494962	total: 2.81s	remaining: 8.81s
242:	learn: 3535.7660303	total: 2.83s	remaining: 8.81s
243:	learn: 3534.7038445	total: 2.84s	remaining: 8.79s
244:	learn: 3534.1410247	total: 2.84s	remaining: 8.77s
245:	learn: 3532.9782948	total: 2.85s	remaining: 8.75s
246:	learn: 3531.7896417	total: 2.86s	remaining: 8.73s
247:	learn: 3531.0077219	total: 2.87s	remaining: 8.71s
248:	learn: 3530.0587625	total: 2.88s	remaining: 8.69s
249:	learn: 3529.1104686	total: 2.89s	remaining: 8.67s
250:	learn: 3527.6977916	total: 2.9s	remaining: 8.65s
251:	learn: 3526.6327743	total: 2.91s	remaining: 8.63s
252:	learn: 3525.9538704	total: 2.92s	remaining: 8.63s
253:	learn: 3525.1081844	total: 2.94s	remaining: 8.62s
254:	learn: 3524.1523168	total: 2.94s	remaining: 8.6s
255:	learn: 3523.0489748	total: 2.95s	remaining: 8.58s
256:	learn: 3522.4085350	total: 2.96s	remaining: 8.56s
257:	learn: 3521.9968127	total: 2.97s	remaining: 8.54s
258:	learn: 3520.8994876	total: 2.98s	remaining: 8.52s
259:	learn: 3519.9995417	total: 2.99s	remaining: 8.5s
260:	learn: 3518.9916263	total: 3s	remaining: 8.48s
261:	learn: 3518.3511702	total: 3s	remaining: 8.46s
262:	learn: 3517.1036050	total: 3.01s	remaining: 8.45s
263:	learn: 3515.9619055	total: 3.02s	remaining: 8.43s
264:	learn: 3514.6559079	total: 3.03s	remaining: 8.41s
265:	learn: 3514.1826418	total: 3.04s	remaining: 8.39s
266:	learn: 3512.9466198	total: 3.05s	remaining: 8.38s
267:	learn: 3511.4435729	total: 3.06s	remaining: 8.36s
268:	learn: 3510.6723674	total: 3.07s	remaining: 8.34s
269:	learn: 3509.9540415	total: 3.08s	remaining: 8.32s
270:	learn: 3508.6675849	total: 3.09s	remaining: 8.31s
271:	learn: 3507.9431624	total: 3.1s	remaining: 8.29s
272:	learn: 3507.2761739	total: 3.11s	remaining: 8.27s
273:	learn: 3506.0827325	total: 3.12s	remaining: 8.26s
274:	learn: 3505.2410346	total: 3.15s	remaining: 8.3s
275:	learn: 3504.6546302	total: 3.16s	remaining: 8.29s
276:	learn: 3504.1762446	total: 3.17s	remaining: 8.27s
277:	learn: 3503.3829241	total: 3.18s	remaining: 8.25s
278:	learn: 3501.9681777	total: 3.19s	remaining: 8.23s
279:	learn: 3501.2604613	total: 3.19s	remaining: 8.21s
280:	learn: 3500.8226814	total: 3.2s	remaining: 8.19s
281:	learn: 3500.2037720	total: 3.21s	remaining: 8.18s
282:	learn: 3499.1324829	total: 3.22s	remaining: 8.16s
283:	learn: 3497.8527085	total: 3.23s	remaining: 8.14s
284:	learn: 3497.2944285	total: 3.24s	remaining: 8.12s
285:	learn: 3496.1716541	total: 3.25s	remaining: 8.1s
286:	learn: 3495.4687961	total: 3.25s	remaining: 8.08s
287:	learn: 3494.9548130	total: 3.26s	remaining: 8.07s
288:	learn: 3494.5146921	total: 3.27s	remaining: 8.05s
289:	learn: 3493.9652037	total: 3.28s	remaining: 8.03s
290:	learn: 3492.6998546	total: 3.29s	remaining: 8.02s
291:	learn: 3491.5051144	total: 3.3s	remaining: 8s
292:	learn: 3490.9797742	total: 3.31s	remaining: 7.98s
293:	learn: 3490.6680608	total: 3.31s	remaining: 7.96s
294:	learn: 3489.9863261	total: 3.32s	remaining: 7.94s
295:	learn: 3489.2734922	total: 3.33s	remaining: 7.92s
296:	learn: 3488.3849594	total: 3.34s	remaining: 7.91s
297:	learn: 3487.3090230	total: 3.36s	remaining: 7.91s
298:	learn: 3486.5970488	total: 3.38s	remaining: 7.93s
299:	learn: 3485.9956878	total: 3.4s	remaining: 7.94s
300:	learn: 3484.6109609	total: 3.42s	remaining: 7.93s
301:	learn: 3483.9927241	total: 3.43s	remaining: 7.93s
302:	learn: 3483.3243438	total: 3.44s	remaining: 7.92s
303:	learn: 3482.5586235	total: 3.45s	remaining: 7.91s
304:	learn: 3481.4590863	total: 3.47s	remaining: 7.9s
305:	learn: 3480.5872637	total: 3.48s	remaining: 7.89s
306:	learn: 3480.0623618	total: 3.49s	remaining: 7.88s
307:	learn: 3479.3951190	total: 3.51s	remaining: 7.89s
308:	learn: 3478.3077734	total: 3.54s	remaining: 7.92s
309:	learn: 3478.0795473	total: 3.56s	remaining: 7.93s
310:	learn: 3477.3933598	total: 3.58s	remaining: 7.93s
311:	learn: 3477.0000003	total: 3.59s	remaining: 7.92s
312:	learn: 3476.2963668	total: 3.6s	remaining: 7.91s
313:	learn: 3475.6999952	total: 3.62s	remaining: 7.91s
314:	learn: 3474.9113868	total: 3.64s	remaining: 7.91s
315:	learn: 3474.2542234	total: 3.67s	remaining: 7.94s
316:	learn: 3473.4965495	total: 3.68s	remaining: 7.94s
317:	learn: 3472.7144214	total: 3.7s	remaining: 7.93s
318:	learn: 3471.2597039	total: 3.71s	remaining: 7.92s
319:	learn: 3470.3395046	total: 3.74s	remaining: 7.94s
320:	learn: 3469.2460597	total: 3.77s	remaining: 7.98s
321:	learn: 3468.6232773	total: 3.81s	remaining: 8.02s
322:	learn: 3467.5299332	total: 3.84s	remaining: 8.04s
323:	learn: 3467.2321214	total: 3.86s	remaining: 8.06s
324:	learn: 3466.0451462	total: 3.89s	remaining: 8.08s
325:	learn: 3465.9081652	total: 3.91s	remaining: 8.09s
326:	learn: 3465.1928554	total: 3.94s	remaining: 8.12s
327:	learn: 3464.1727490	total: 3.96s	remaining: 8.11s
328:	learn: 3463.5295060	total: 3.97s	remaining: 8.09s
329:	learn: 3462.7083588	total: 3.99s	remaining: 8.11s
330:	learn: 3461.5729559	total: 4.01s	remaining: 8.11s
331:	learn: 3460.7634072	total: 4.03s	remaining: 8.11s
332:	learn: 3459.8372892	total: 4.04s	remaining: 8.1s
333:	learn: 3459.1417179	total: 4.05s	remaining: 8.09s
334:	learn: 3458.3225192	total: 4.07s	remaining: 8.07s
335:	learn: 3457.8991761	total: 4.08s	remaining: 8.06s
336:	learn: 3457.0463228	total: 4.09s	remaining: 8.05s
337:	learn: 3456.3843937	total: 4.1s	remaining: 8.04s
338:	learn: 3454.9896878	total: 4.11s	remaining: 8.02s
339:	learn: 3454.4221944	total: 4.13s	remaining: 8.01s
340:	learn: 3452.7884738	total: 4.15s	remaining: 8.02s
341:	learn: 3452.3827585	total: 4.18s	remaining: 8.04s
342:	learn: 3452.0228071	total: 4.23s	remaining: 8.1s
343:	learn: 3451.1173498	total: 4.25s	remaining: 8.11s
344:	learn: 3450.3443487	total: 4.28s	remaining: 8.12s
345:	learn: 3449.5027636	total: 4.3s	remaining: 8.13s
346:	learn: 3448.8861566	total: 4.31s	remaining: 8.12s
347:	learn: 3448.1286786	total: 4.34s	remaining: 8.13s
348:	learn: 3447.3919921	total: 4.37s	remaining: 8.14s
349:	learn: 3446.7431927	total: 4.39s	remaining: 8.15s
350:	learn: 3445.7439572	total: 4.41s	remaining: 8.15s
351:	learn: 3444.9318694	total: 4.42s	remaining: 8.14s
352:	learn: 3444.2427363	total: 4.44s	remaining: 8.13s
353:	learn: 3443.3022384	total: 4.45s	remaining: 8.12s
354:	learn: 3442.0477762	total: 4.46s	remaining: 8.11s
355:	learn: 3441.4835301	total: 4.48s	remaining: 8.1s
356:	learn: 3440.6156216	total: 4.49s	remaining: 8.09s
357:	learn: 3439.9741561	total: 4.5s	remaining: 8.07s
358:	learn: 3438.7584219	total: 4.51s	remaining: 8.06s
359:	learn: 3438.3108823	total: 4.53s	remaining: 8.04s
360:	learn: 3437.7298442	total: 4.55s	remaining: 8.05s
361:	learn: 3437.1243949	total: 4.57s	remaining: 8.06s
362:	learn: 3436.4467457	total: 4.59s	remaining: 8.06s
363:	learn: 3435.9014356	total: 4.64s	remaining: 8.12s
364:	learn: 3435.3698083	total: 4.68s	remaining: 8.15s
365:	learn: 3434.5649941	total: 4.71s	remaining: 8.16s
366:	learn: 3433.6263523	total: 4.74s	remaining: 8.17s
367:	learn: 3432.7021390	total: 4.76s	remaining: 8.18s
368:	learn: 3432.0698689	total: 4.79s	remaining: 8.2s
369:	learn: 3431.1957504	total: 4.82s	remaining: 8.21s
370:	learn: 3430.2235826	total: 4.85s	remaining: 8.22s
371:	learn: 3429.5274399	total: 4.88s	remaining: 8.23s
372:	learn: 3428.8872160	total: 4.91s	remaining: 8.25s
373:	learn: 3428.1325193	total: 4.93s	remaining: 8.26s
374:	learn: 3427.0825059	total: 4.96s	remaining: 8.26s
375:	learn: 3425.9400033	total: 4.99s	remaining: 8.28s
376:	learn: 3425.2663530	total: 5.02s	remaining: 8.29s
377:	learn: 3424.7002675	total: 5.04s	remaining: 8.3s
378:	learn: 3423.9625511	total: 5.08s	remaining: 8.32s
379:	learn: 3422.8030768	total: 5.1s	remaining: 8.32s
380:	learn: 3422.1585894	total: 5.13s	remaining: 8.34s
381:	learn: 3421.3915269	total: 5.16s	remaining: 8.35s
382:	learn: 3420.9865314	total: 5.18s	remaining: 8.35s
383:	learn: 3420.4169179	total: 5.21s	remaining: 8.36s
384:	learn: 3419.7120177	total: 5.24s	remaining: 8.37s
385:	learn: 3419.0923519	total: 5.26s	remaining: 8.37s
386:	learn: 3418.3709353	total: 5.3s	remaining: 8.4s
387:	learn: 3417.4983499	total: 5.33s	remaining: 8.4s
388:	learn: 3416.5593383	total: 5.36s	remaining: 8.41s
389:	learn: 3415.2516981	total: 5.38s	remaining: 8.41s
390:	learn: 3414.3466109	total: 5.39s	remaining: 8.39s
391:	learn: 3413.9701190	total: 5.4s	remaining: 8.38s
392:	learn: 3413.2990193	total: 5.41s	remaining: 8.36s
393:	learn: 3412.6078132	total: 5.43s	remaining: 8.35s
394:	learn: 3411.7018104	total: 5.45s	remaining: 8.34s
395:	learn: 3410.5699270	total: 5.48s	remaining: 8.35s
396:	learn: 3410.1524608	total: 5.51s	remaining: 8.37s
397:	learn: 3409.4367953	total: 5.53s	remaining: 8.37s
398:	learn: 3409.0847208	total: 5.56s	remaining: 8.38s
399:	learn: 3408.5446935	total: 5.58s	remaining: 8.37s
400:	learn: 3407.8025864	total: 5.61s	remaining: 8.37s
401:	learn: 3406.8104298	total: 5.62s	remaining: 8.36s
402:	learn: 3406.0361165	total: 5.63s	remaining: 8.35s
403:	learn: 3405.1086650	total: 5.65s	remaining: 8.33s
404:	learn: 3404.1403463	total: 5.66s	remaining: 8.32s
405:	learn: 3403.2958967	total: 5.68s	remaining: 8.31s
406:	learn: 3402.7224194	total: 5.69s	remaining: 8.29s
407:	learn: 3401.9217438	total: 5.71s	remaining: 8.29s
408:	learn: 3400.9839850	total: 5.73s	remaining: 8.28s
409:	learn: 3400.2217858	total: 5.75s	remaining: 8.27s
410:	learn: 3399.5435192	total: 5.76s	remaining: 8.26s
411:	learn: 3399.0118822	total: 5.77s	remaining: 8.24s
412:	learn: 3398.5473937	total: 5.78s	remaining: 8.22s
413:	learn: 3397.9498398	total: 5.8s	remaining: 8.21s
414:	learn: 3397.5188745	total: 5.81s	remaining: 8.19s
415:	learn: 3397.0111933	total: 5.82s	remaining: 8.18s
416:	learn: 3396.4047066	total: 5.83s	remaining: 8.16s
417:	learn: 3396.1001488	total: 5.84s	remaining: 8.14s
418:	learn: 3395.5677538	total: 5.86s	remaining: 8.12s
419:	learn: 3395.0291883	total: 5.87s	remaining: 8.11s
420:	learn: 3394.3409887	total: 5.88s	remaining: 8.09s
421:	learn: 3393.7025479	total: 5.9s	remaining: 8.08s
422:	learn: 3392.7109947	total: 5.93s	remaining: 8.09s
423:	learn: 3392.1471839	total: 5.95s	remaining: 8.08s
424:	learn: 3391.4204310	total: 5.97s	remaining: 8.07s
425:	learn: 3390.3193574	total: 5.99s	remaining: 8.07s
426:	learn: 3389.6086927	total: 6.01s	remaining: 8.07s
427:	learn: 3388.9897059	total: 6.04s	remaining: 8.07s
428:	learn: 3388.3082754	total: 6.07s	remaining: 8.08s
429:	learn: 3387.6516580	total: 6.1s	remaining: 8.08s
430:	learn: 3386.9689176	total: 6.12s	remaining: 8.08s
431:	learn: 3386.4577460	total: 6.15s	remaining: 8.08s
432:	learn: 3385.7186158	total: 6.17s	remaining: 8.09s
433:	learn: 3385.1299467	total: 6.2s	remaining: 8.08s
434:	learn: 3384.3160890	total: 6.22s	remaining: 8.09s
435:	learn: 3383.2057630	total: 6.26s	remaining: 8.1s
436:	learn: 3382.7078849	total: 6.27s	remaining: 8.08s
437:	learn: 3381.8097692	total: 6.28s	remaining: 8.06s
438:	learn: 3381.2897998	total: 6.29s	remaining: 8.04s
439:	learn: 3380.1627129	total: 6.3s	remaining: 8.02s
440:	learn: 3379.9744228	total: 6.31s	remaining: 8s
441:	learn: 3379.0786815	total: 6.32s	remaining: 7.98s
442:	learn: 3378.5784378	total: 6.33s	remaining: 7.96s
443:	learn: 3377.6890522	total: 6.34s	remaining: 7.93s
444:	learn: 3377.2001172	total: 6.35s	remaining: 7.92s
445:	learn: 3376.6939463	total: 6.36s	remaining: 7.9s
446:	learn: 3375.9749216	total: 6.37s	remaining: 7.88s
447:	learn: 3375.1632481	total: 6.38s	remaining: 7.87s
448:	learn: 3374.5866583	total: 6.39s	remaining: 7.85s
449:	learn: 3373.7227001	total: 6.4s	remaining: 7.83s
450:	learn: 3373.1111445	total: 6.41s	remaining: 7.8s
451:	learn: 3372.5014282	total: 6.42s	remaining: 7.78s
452:	learn: 3371.4744310	total: 6.43s	remaining: 7.76s
453:	learn: 3370.4564633	total: 6.44s	remaining: 7.74s
454:	learn: 3370.0679922	total: 6.45s	remaining: 7.72s
455:	learn: 3369.6855702	total: 6.46s	remaining: 7.7s
456:	learn: 3369.0827466	total: 6.46s	remaining: 7.68s
457:	learn: 3368.5251110	total: 6.47s	remaining: 7.66s
458:	learn: 3368.2220730	total: 6.48s	remaining: 7.64s
459:	learn: 3367.5326998	total: 6.49s	remaining: 7.62s
460:	learn: 3367.3005252	total: 6.5s	remaining: 7.6s
461:	learn: 3366.5982617	total: 6.51s	remaining: 7.58s
462:	learn: 3366.3003385	total: 6.52s	remaining: 7.56s
463:	learn: 3365.4667369	total: 6.53s	remaining: 7.54s
464:	learn: 3364.7942490	total: 6.54s	remaining: 7.52s
465:	learn: 3364.4023983	total: 6.56s	remaining: 7.51s
466:	learn: 3364.0205366	total: 6.57s	remaining: 7.5s
467:	learn: 3363.6326146	total: 6.58s	remaining: 7.48s
468:	learn: 3363.2736688	total: 6.58s	remaining: 7.45s
469:	learn: 3362.9251274	total: 6.59s	remaining: 7.43s
470:	learn: 3362.5107491	total: 6.6s	remaining: 7.42s
471:	learn: 3362.0869862	total: 6.61s	remaining: 7.4s
472:	learn: 3361.8587373	total: 6.62s	remaining: 7.38s
473:	learn: 3361.3160989	total: 6.63s	remaining: 7.36s
474:	learn: 3361.1003576	total: 6.64s	remaining: 7.34s
475:	learn: 3360.4559659	total: 6.65s	remaining: 7.32s
476:	learn: 3359.9312477	total: 6.66s	remaining: 7.3s
477:	learn: 3359.6989960	total: 6.68s	remaining: 7.29s
478:	learn: 3359.1697998	total: 6.7s	remaining: 7.28s
479:	learn: 3358.6847123	total: 6.7s	remaining: 7.26s
480:	learn: 3357.8496014	total: 6.71s	remaining: 7.24s
481:	learn: 3357.1111219	total: 6.72s	remaining: 7.23s
482:	learn: 3356.6624673	total: 6.73s	remaining: 7.21s
483:	learn: 3356.4442660	total: 6.74s	remaining: 7.19s
484:	learn: 3356.1149768	total: 6.75s	remaining: 7.17s
485:	learn: 3354.9282641	total: 6.77s	remaining: 7.16s
486:	learn: 3354.3685057	total: 6.78s	remaining: 7.14s
487:	learn: 3354.2596912	total: 6.79s	remaining: 7.13s
488:	learn: 3353.9491039	total: 6.8s	remaining: 7.11s
489:	learn: 3353.5034720	total: 6.81s	remaining: 7.09s
490:	learn: 3353.2120227	total: 6.82s	remaining: 7.07s
491:	learn: 3352.6888721	total: 6.83s	remaining: 7.05s
492:	learn: 3352.5895831	total: 6.83s	remaining: 7.03s
493:	learn: 3351.8376709	total: 6.84s	remaining: 7.01s
494:	learn: 3351.7589519	total: 6.85s	remaining: 6.99s
495:	learn: 3351.2425807	total: 6.86s	remaining: 6.97s
496:	learn: 3350.8543296	total: 6.87s	remaining: 6.95s
497:	learn: 3350.7518063	total: 6.88s	remaining: 6.93s
498:	learn: 3349.7534142	total: 6.89s	remaining: 6.91s
499:	learn: 3349.4228685	total: 6.89s	remaining: 6.89s
500:	learn: 3349.0351536	total: 6.9s	remaining: 6.88s
501:	learn: 3348.1596792	total: 6.91s	remaining: 6.86s
502:	learn: 3347.7822473	total: 6.92s	remaining: 6.84s
503:	learn: 3347.0734440	total: 6.93s	remaining: 6.82s
504:	learn: 3346.3532058	total: 6.95s	remaining: 6.81s
505:	learn: 3345.8348182	total: 6.96s	remaining: 6.8s
506:	learn: 3345.4721215	total: 6.97s	remaining: 6.78s
507:	learn: 3344.8776826	total: 6.98s	remaining: 6.76s
508:	learn: 3344.4265156	total: 6.99s	remaining: 6.74s
509:	learn: 3344.0943908	total: 7s	remaining: 6.73s
510:	learn: 3343.2022717	total: 7.01s	remaining: 6.71s
511:	learn: 3342.7632933	total: 7.02s	remaining: 6.69s
512:	learn: 3341.7585941	total: 7.03s	remaining: 6.67s
513:	learn: 3341.5587889	total: 7.04s	remaining: 6.65s
514:	learn: 3340.9903799	total: 7.04s	remaining: 6.63s
515:	learn: 3340.4104343	total: 7.05s	remaining: 6.62s
516:	learn: 3339.8186072	total: 7.06s	remaining: 6.6s
517:	learn: 3339.0355166	total: 7.07s	remaining: 6.58s
518:	learn: 3338.4573524	total: 7.08s	remaining: 6.56s
519:	learn: 3337.6830720	total: 7.09s	remaining: 6.54s
520:	learn: 3337.0671844	total: 7.1s	remaining: 6.53s
521:	learn: 3336.6280954	total: 7.11s	remaining: 6.51s
522:	learn: 3336.1862879	total: 7.12s	remaining: 6.49s
523:	learn: 3335.9331307	total: 7.13s	remaining: 6.47s
524:	learn: 3335.6298505	total: 7.13s	remaining: 6.45s
525:	learn: 3335.2930089	total: 7.14s	remaining: 6.44s
526:	learn: 3334.5216782	total: 7.15s	remaining: 6.42s
527:	learn: 3333.6101963	total: 7.16s	remaining: 6.4s
528:	learn: 3332.8962532	total: 7.18s	remaining: 6.39s
529:	learn: 3332.6669461	total: 7.19s	remaining: 6.37s
530:	learn: 3332.5055912	total: 7.2s	remaining: 6.36s
531:	learn: 3332.2196911	total: 7.21s	remaining: 6.34s
532:	learn: 3331.6101199	total: 7.21s	remaining: 6.32s
533:	learn: 3331.1861379	total: 7.22s	remaining: 6.3s
534:	learn: 3330.7031833	total: 7.23s	remaining: 6.29s
535:	learn: 3330.3866539	total: 7.24s	remaining: 6.27s
536:	learn: 3330.0270024	total: 7.25s	remaining: 6.25s
537:	learn: 3329.4228858	total: 7.26s	remaining: 6.23s
538:	learn: 3328.8843701	total: 7.27s	remaining: 6.21s
539:	learn: 3328.3471520	total: 7.27s	remaining: 6.2s
540:	learn: 3327.9563325	total: 7.29s	remaining: 6.19s
541:	learn: 3327.2240797	total: 7.3s	remaining: 6.17s
542:	learn: 3326.8283602	total: 7.31s	remaining: 6.15s
543:	learn: 3326.0350840	total: 7.32s	remaining: 6.14s
544:	learn: 3325.5165160	total: 7.33s	remaining: 6.12s
545:	learn: 3325.0532957	total: 7.34s	remaining: 6.1s
546:	learn: 3324.4188671	total: 7.35s	remaining: 6.08s
547:	learn: 3323.8024582	total: 7.36s	remaining: 6.07s
548:	learn: 3323.4638432	total: 7.37s	remaining: 6.06s
549:	learn: 3322.9087050	total: 7.38s	remaining: 6.04s
550:	learn: 3322.5576318	total: 7.39s	remaining: 6.02s
551:	learn: 3321.9446949	total: 7.4s	remaining: 6.01s
552:	learn: 3321.4271385	total: 7.41s	remaining: 5.99s
553:	learn: 3320.5773219	total: 7.42s	remaining: 5.97s
554:	learn: 3319.9870908	total: 7.43s	remaining: 5.95s
555:	learn: 3319.5969317	total: 7.43s	remaining: 5.94s
556:	learn: 3319.0410915	total: 7.44s	remaining: 5.92s
557:	learn: 3318.7220376	total: 7.45s	remaining: 5.9s
558:	learn: 3318.2623761	total: 7.46s	remaining: 5.89s
559:	learn: 3317.9166557	total: 7.47s	remaining: 5.87s
560:	learn: 3317.2467748	total: 7.48s	remaining: 5.85s
561:	learn: 3316.5379492	total: 7.49s	remaining: 5.83s
562:	learn: 3315.9160775	total: 7.5s	remaining: 5.82s
563:	learn: 3315.3343009	total: 7.5s	remaining: 5.8s
564:	learn: 3315.0934588	total: 7.51s	remaining: 5.78s
565:	learn: 3314.8285176	total: 7.52s	remaining: 5.77s
566:	learn: 3314.4311966	total: 7.53s	remaining: 5.75s
567:	learn: 3313.7689186	total: 7.54s	remaining: 5.73s
568:	learn: 3313.3909060	total: 7.55s	remaining: 5.72s
569:	learn: 3312.8879230	total: 7.56s	remaining: 5.7s
570:	learn: 3312.1453045	total: 7.57s	remaining: 5.68s
571:	learn: 3311.4816338	total: 7.59s	remaining: 5.68s
572:	learn: 3310.9388254	total: 7.61s	remaining: 5.67s
573:	learn: 3310.2669556	total: 7.62s	remaining: 5.65s
574:	learn: 3309.6073556	total: 7.63s	remaining: 5.64s
575:	learn: 3309.2875977	total: 7.63s	remaining: 5.62s
576:	learn: 3308.1995522	total: 7.64s	remaining: 5.61s
577:	learn: 3307.3925795	total: 7.66s	remaining: 5.59s
578:	learn: 3306.7388451	total: 7.67s	remaining: 5.58s
579:	learn: 3306.4074442	total: 7.68s	remaining: 5.56s
580:	learn: 3305.9245240	total: 7.69s	remaining: 5.54s
581:	learn: 3305.3688782	total: 7.7s	remaining: 5.53s
582:	learn: 3304.9982360	total: 7.71s	remaining: 5.51s
583:	learn: 3304.6959383	total: 7.71s	remaining: 5.5s
584:	learn: 3304.3384795	total: 7.72s	remaining: 5.48s
585:	learn: 3303.6640659	total: 7.73s	remaining: 5.46s
586:	learn: 3303.2835079	total: 7.74s	remaining: 5.45s
587:	learn: 3302.4710718	total: 7.75s	remaining: 5.43s
588:	learn: 3301.7416782	total: 7.76s	remaining: 5.41s
589:	learn: 3301.2989813	total: 7.77s	remaining: 5.4s
590:	learn: 3300.7697794	total: 7.79s	remaining: 5.39s
591:	learn: 3300.1370396	total: 7.79s	remaining: 5.37s
592:	learn: 3299.7169941	total: 7.8s	remaining: 5.36s
593:	learn: 3299.5337374	total: 7.81s	remaining: 5.34s
594:	learn: 3298.8524768	total: 7.82s	remaining: 5.32s
595:	learn: 3298.4769914	total: 7.83s	remaining: 5.31s
596:	learn: 3298.0099454	total: 7.84s	remaining: 5.29s
597:	learn: 3297.2770273	total: 7.85s	remaining: 5.28s
598:	learn: 3296.6874119	total: 7.86s	remaining: 5.26s
599:	learn: 3296.0695810	total: 7.87s	remaining: 5.24s
600:	learn: 3295.4675830	total: 7.87s	remaining: 5.23s
601:	learn: 3295.0372525	total: 7.89s	remaining: 5.21s
602:	learn: 3294.5561775	total: 7.89s	remaining: 5.2s
603:	learn: 3293.9290092	total: 7.9s	remaining: 5.18s
604:	learn: 3293.3130998	total: 7.91s	remaining: 5.17s
605:	learn: 3292.8458996	total: 7.92s	remaining: 5.15s
606:	learn: 3292.5161451	total: 7.93s	remaining: 5.13s
607:	learn: 3292.1500402	total: 7.94s	remaining: 5.12s
608:	learn: 3291.7943458	total: 7.95s	remaining: 5.1s
609:	learn: 3291.4329701	total: 7.96s	remaining: 5.09s
610:	learn: 3291.0967195	total: 7.96s	remaining: 5.07s
611:	learn: 3290.5815175	total: 7.97s	remaining: 5.05s
612:	learn: 3290.1074206	total: 7.99s	remaining: 5.04s
613:	learn: 3289.7963860	total: 8s	remaining: 5.03s
614:	learn: 3289.3154066	total: 8.01s	remaining: 5.01s
615:	learn: 3288.8704344	total: 8.02s	remaining: 5s
616:	learn: 3288.2273805	total: 8.03s	remaining: 4.98s
617:	learn: 3287.6043420	total: 8.04s	remaining: 4.97s
618:	learn: 3287.0453677	total: 8.05s	remaining: 4.96s
619:	learn: 3286.5838008	total: 8.06s	remaining: 4.94s
620:	learn: 3285.9675564	total: 8.07s	remaining: 4.92s
621:	learn: 3285.6753378	total: 8.08s	remaining: 4.91s
622:	learn: 3285.0743203	total: 8.09s	remaining: 4.89s
623:	learn: 3284.1161016	total: 8.1s	remaining: 4.88s
624:	learn: 3283.3796664	total: 8.11s	remaining: 4.86s
625:	learn: 3282.1961681	total: 8.11s	remaining: 4.85s
626:	learn: 3281.7897223	total: 8.12s	remaining: 4.83s
627:	learn: 3281.5021923	total: 8.13s	remaining: 4.82s
628:	learn: 3281.0716488	total: 8.14s	remaining: 4.8s
629:	learn: 3280.6713379	total: 8.15s	remaining: 4.79s
630:	learn: 3280.3209626	total: 8.16s	remaining: 4.77s
631:	learn: 3279.9926859	total: 8.17s	remaining: 4.75s
632:	learn: 3279.7297390	total: 8.18s	remaining: 4.74s
633:	learn: 3279.4113748	total: 8.19s	remaining: 4.72s
634:	learn: 3278.8114031	total: 8.2s	remaining: 4.71s
635:	learn: 3278.3608082	total: 8.21s	remaining: 4.7s
636:	learn: 3277.7208253	total: 8.22s	remaining: 4.68s
637:	learn: 3277.3680969	total: 8.23s	remaining: 4.67s
638:	learn: 3276.7193569	total: 8.23s	remaining: 4.65s
639:	learn: 3276.4470347	total: 8.24s	remaining: 4.64s
640:	learn: 3275.4275629	total: 8.25s	remaining: 4.62s
641:	learn: 3275.0629608	total: 8.26s	remaining: 4.61s
642:	learn: 3274.6452960	total: 8.27s	remaining: 4.59s
[... CatBoost training log, iterations 643–999 elided; final iteration: 999: learn: 3124.2555094 total: 11.7s ...]
In [312]:
cv_metricas.columns
Out[312]:
Index(['<catboost.core.CatBoostRegressor object at 0x7f9bdd3ca910>'], dtype='object')
In [315]:
#Rename the column of the results table
cv_metricas = cv_metricas.rename(columns={'<catboost.core.CatBoostRegressor object at 0x7f9bdd3ca910>':'CatBoostRegressor'})
In [316]:
#Reformat the column values
cv_metricas['CatBoostRegressor'] = cv_metricas['CatBoostRegressor'].map('{:,.4f}'.format)
In [317]:
cv_metricas
Out[317]:
CatBoostRegressor
Mean Absolute Error 3,552.0000
Mean Squared Error 96,902,791.0000
R^2 0.8316

We see that the MAE dropped to $3,552, while the R^2 stayed practically the same as in the cross-validation run from the earlier model comparison.

Hyperparameter tuning¶

We will select the optimal hyperparameters for our model using CatBoost's native randomized_search method.

In [ ]:
train_data = Pool(data=X2,
                  label=y2,
                  cat_features=cat_features
                 )

cat_model = CatBoostRegressor(loss_function= 'MAE')

grid = {'learning_rate': [0.03, 0.1],
        'depth': [6, 8, 10],
        'l2_leaf_reg': [1, 3, 5, 7, 9]}

randomized_search_results = cat_model.randomized_search(
    grid,
    train_data,
    n_iter=12,
    shuffle=False,
)

We want to know which hyperparameters turned out to be optimal:

In [ ]:
randomized_search_results['params']
Out[ ]:
{'depth': 8, 'l2_leaf_reg': 3, 'learning_rate': 0.03}
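As a quick sanity check (a sketch, not part of the original notebook), the grid above spans 2 × 3 × 5 = 30 combinations, so n_iter=12 samples fewer than half of them:

```python
from itertools import product

# The same search space passed to randomized_search above
grid = {'learning_rate': [0.03, 0.1],
        'depth': [6, 8, 10],
        'l2_leaf_reg': [1, 3, 5, 7, 9]}

# Total number of hyperparameter combinations in the grid
total = len(list(product(*grid.values())))
print(total)       # 30 combinations
print(12 / total)  # n_iter=12 covers 40% of the grid
```

With a grid this small, an exhaustive grid_search would also be feasible; randomized_search is mainly a time saver when the space grows.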

Final training¶

Train/test split of the data.

In [318]:
X_train, X_test, y_train, y_test = train_test_split(X2, y2, test_size=0.30, random_state=42)

Creating the Pool objects.

In [319]:
#Create the Pool object for the training set. We pass the categorical
#columns through the cat_features parameter
train_data = Pool(data=X_train,
                  label=y_train,
                  cat_features=cat_features
                 )
#Create the Pool object for the evaluation set
test_data = Pool(data=X_test,
                  label=y_test,
                  cat_features=cat_features
                 )

Creating and training the model.

In [ ]:
#Build the model with the optimal hyperparameters
cat_model = CatBoostRegressor(loss_function='MAE', depth=8, l2_leaf_reg=3, learning_rate=0.03)
#Train the model on the Pool objects created above, so that cat_features is taken into account
cat_model.fit(train_data, eval_set=test_data)

Feature importances¶

In [321]:
# Build the feature-importances DataFrame
df_feature_importance = pd.DataFrame(cat_model.get_feature_importance(prettified=True))
# Plot the feature importances
plt.figure(figsize=(12, 6));
feature_plot = sns.barplot(x="Importances", y="Feature Id", data=df_feature_importance, palette="cool");
plt.title('Feature importances');

From the chart we see that the three most important variables are weekly hours worked (recall that the EDA was inconclusive about this one), region of residence, and company size, followed by the worker's age.

In contrast, the variable Estado (whether the person is employed or not) turned out to have zero importance. The next least important variables are occupational hierarchy (whether the person is a director, manager, salaried employee, or self-employed) and the size of the urban agglomeration of residence (whether or not it has more than 500 thousand inhabitants).
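Features with near-zero importance are candidates for removal in a future iteration. A minimal sketch of that filtering step (the importance values below are illustrative, not the model's actual output):

```python
# Hypothetical (Feature Id, Importance) pairs, mimicking the shape of
# the prettified get_feature_importance() output
importances = [
    ('Horas_trab', 28.4),
    ('REGION', 15.2),
    ('Tamaño_empr_2', 12.1),
    ('Edad', 9.7),
    ('JERARQUIA_OCUP', 0.8),
    ('MAS_500', 0.5),
    ('ESTADO', 0.0),
]

# Keep only features whose importance clears a small threshold
threshold = 1.0
kept = [name for name, imp in importances if imp >= threshold]
print(kept)  # ['Horas_trab', 'REGION', 'Tamaño_empr_2', 'Edad']
```

Dropping such columns and retraining would confirm whether they contribute anything beyond noise.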

Predictions¶

In [322]:
#Compute the predictions
y_predict= cat_model.predict(X_test)

We replace the negative predicted values:

In [323]:
#Replace negative predictions with 0
y_predict = np.where((y_predict <0 ), 0, y_predict)
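The np.where call above simply clamps each prediction at zero (income cannot be negative); a tiny stdlib illustration of the same operation:

```python
# Same clipping as np.where(pred < 0, 0, pred), without numpy
preds = [-1200.5, 0.0, 35000.0, -3.2e-07, 45000.0]
clipped = [max(0.0, p) for p in preds]
print(clipped)  # [0.0, 0.0, 35000.0, 0.0, 45000.0]
```

In numpy itself, np.clip(y_predict, 0, None) achieves the same result and is arguably more idiomatic.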

We build a table to plot the predictions against the actual income values:

In [324]:
tabla_resumen = X_test.copy()
tabla_resumen['Data real'] = y_test
In [325]:
#Add the final model's predictions to our table
tabla_resumen['Predicciones'] = y_predict
In [326]:
tabla_resumen.sample(10)
Out[326]:
REGION AGLOMERADO MAS_500 Sexo Edad NIVEL_ED_2 NIVEL_ED ESTADO CAT_OCUP Cant_Ocup ... Lugar_trab Tamaño_empr_2 CARACTER_OCUP JERARQUIA_OCUP TECNOLOGIA_OCUP CALIFICACION_OCUP ACTIV_ECON CAT_ECON Data real Predicciones
12416 5 33 1 2 47 6 5 1 2 1 ... 9 0 34 2 2 2 49 13 70000 6.695844e+04
26578 2 12 0 1 1 1 0 0 0 0 ... 0 0 0 0 0 0 0 21 0 1.379172e-08
37979 4 4 1 1 30 7 5 1 2 1 ... 1 3 10 2 3 2 85 16 60000 4.574408e+04
36525 2 15 0 1 52 7 5 1 2 1 ... 1 3 10 2 3 2 85 16 45000 5.329982e+04
31307 4 13 1 2 22 6 5 0 0 0 ... 0 0 0 0 0 0 0 21 0 3.226277e-07
46744 4 6 0 2 67 2 2 0 0 0 ... 0 0 0 0 0 0 0 21 0 0.000000e+00
25970 4 30 0 1 94 2 2 0 0 0 ... 0 0 0 0 0 0 0 21 0 6.199677e-07
31988 6 17 0 1 51 7 5 0 0 0 ... 0 0 0 0 0 0 0 21 0 4.311578e-07
23088 2 7 0 1 75 2 2 0 0 0 ... 0 0 0 0 0 0 0 21 0 6.817705e-08
7980 1 29 1 1 13 2 1 0 0 0 ... 0 0 0 0 0 0 0 21 0 0.000000e+00

10 rows × 26 columns

Plotting:

In [327]:
fig = px.scatter(tabla_resumen, 
                 x= tabla_resumen.index,
                 y= ['Data real','Predicciones'],
                 trendline="lowess", 
                 trendline_options=dict(frac=0.03),
                 title="Ingresos en miles de pesos argentinos - Data real vs Predicciones")
fig.data = [t for t in fig.data if t.mode == "lines"]
fig.update_traces(showlegend=True) #trendlines have showlegend=False by default
fig.show()

Final metrics¶

In [328]:
import math

#RMSE
Rmse_test = math.sqrt(mean_squared_error(y_test,y_predict))

#R2 Score
r2_test = r2_score(y_test,y_predict)

# Adjusted R2 Score
n = X_train.shape[0] # total number of datapoints
p = X_train.shape[1] # total number of independent features
adj_r2_test = 1-(1-r2_test)*(n-1)/(n-p-1)

#Mean absolute error
MAE_test= mean_absolute_error(y_test, y_predict)

#Print the results
print("Evaluation on test data")
print("RMSE: {:.2f}".format(Rmse_test))
print("R2: {:.3f}".format(r2_test))
print("Adjusted R2: {:.3f}".format(adj_r2_test))
print('MAE: {:.2f}'.format(MAE_test))
Evaluation on test data
RMSE: 9763.16
R2: 0.831
Adjusted R2: 0.831
MAE: 3471.25

From the actual-vs-predicted chart, we see that our model mispredicts some of the peaks. Nevertheless, the mean absolute error dropped to $3,471, which is roughly 31 US dollars at the official exchange rate of March 2022, the date of the dataset. Meanwhile, the R^2 reached 0.831.

Recalling that the average income is USD 476, an average error of ±USD 31 represents roughly a 6.6% average error in the predictions.
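The conversion can be checked in a couple of lines (the rate of ~110 ARS/USD is an approximation of the March 2022 official exchange rate, not a figure from the notebook):

```python
mae_ars = 3471.25       # test-set MAE, in Argentine pesos
ars_per_usd = 110.0     # approximate official rate, March 2022 (assumption)
avg_income_usd = 476.0  # average income reported earlier in the analysis

mae_usd = mae_ars / ars_per_usd
error_pct = mae_usd / avg_income_usd * 100

print(round(mae_usd, 1))    # ~31.6 USD
print(round(error_pct, 1))  # ~6.6 % average error
```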

Recommendations - Final words¶

This model's predictions are based on income data collected in a single quarter of 2022.

To confirm and/or revise the insights obtained, it would be worthwhile to extend the analysis to the remaining quarters and years available (at least five years back). This would also give a sense of how wages evolve over time, given the inflationary phenomenon Argentina suffers.

Another interesting avenue would be to combine the "Personas" database (used in this project) with the "Hogares" one, in order to mine association rules between them and establish behavioral patterns.